Hi,
I would like quorum to use 3 TB of memory on our server. I ended up with several core dumps or, at best, errors on the command line. The "-s 3T" suffix is not recognized, probably untested.
$ quorum -t 104 -s 2.5T -k 31 myfile.fastq
Invalid size '2.5T'. It must be a number, maybe followed by a suffix (like k, M, G for thousand, million and billion).
$
I have 323975447 Illumina sequences in the file (paired-end sequences, interleaved, some are probably singletons).
$ quorum -t 104 -s 2500G -k 31 myfile.fastq
terminate called after throwing an instance of 'jellyfish::large_hash::array_base<jellyfish::mer_dna_ns::mer_base_static<unsigned long, 0>, unsigned long, atomic::gcc, jellyfish::large_hash::array<jellyfish::mer_dna_ns::mer_base_static<unsigned long, 0>, unsigned long, atomic::gcc, allocators::mmap> >::ErrorAllocation'
what(): Failed to allocate 9000000000000 bytes of memory
Creating the mer database failed. Most likely the size passed to the -s switch is too small. at /apps/gentoo/usr/bin/quorum line 143.
$
Is the memory specified on the command line multiplied by the number of threads? I do not understand where the 9 TB figure comes from.
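One hedged guess at the arithmetic (this is an assumption, not confirmed from the quorum source): if `-s` is passed through to Jellyfish as a hash size in *entries* rather than bytes, the reported 9,000,000,000,000-byte allocation works out to a plausible per-entry overhead of a few bytes:

```python
# Hypothetical back-of-the-envelope check, assuming "-s 2500G" means
# 2.5e12 hash table entries (Jellyfish semantics), not 2.5e12 bytes.
requested_entries = 2_500_000_000_000   # from "-s 2500G"
reported_bytes = 9_000_000_000_000      # from the ErrorAllocation message

bytes_per_entry = reported_bytes / requested_entries
print(bytes_per_entry)  # -> 3.6
```

If that reading is right, the fix would be to pass a much smaller `-s` (sized to the expected number of distinct k-mers, not to total RAM), rather than the full 3 TB expressed in bytes.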
The core dump says it was generated by quorum_create_database -s 2500G -m 31 -t 104 -q 38 -b 7 -o
(gdb) bt full
#0 0x00002aaaab776124 in raise () from /apps/gentoo/lib64/libc.so.6
No symbol table info available.
#1 0x00002aaaab77758a in abort () from /apps/gentoo/lib64/libc.so.6
No symbol table info available.
#2 0x00002aaaab1e2ecd in __gnu_cxx::__verbose_terminate_handler() () at /apps/gentoo/var/tmp/portage/sys-devel/gcc-5.4.0-r3/work/gcc-5.4.0/libstdc++-v3/libsupc++/vterminate.cc:95
No locals.
#3 0x00002aaaab1e0d06 in __cxxabiv1::__terminate(void (*)()) () at /apps/gentoo/var/tmp/portage/sys-devel/gcc-5.4.0-r3/work/gcc-5.4.0/libstdc++-v3/libsupc++/eh_terminate.cc:47
No locals.
#4 0x00002aaaab1e0d51 in std::terminate() () at /apps/gentoo/var/tmp/portage/sys-devel/gcc-5.4.0-r3/work/gcc-5.4.0/libstdc++-v3/libsupc++/eh_terminate.cc:57
No locals.
#5 0x00002aaaab1e0f68 in __cxa_throw () at /apps/gentoo/var/tmp/portage/sys-devel/gcc-5.4.0-r3/work/gcc-5.4.0/libstdc++-v3/libsupc++/eh_throw.cc:87
No locals.
#6 0x0000000000406a7b in main () at /apps/gentoo/usr/include/jellyfish/large_hash_array.hpp:180
args = {size_arg = 2500000000000, size_given = true, mer_arg = 31, mer_given = true, bits_arg = 7, bits_given = true, min_qual_value_arg = 38, min_qual_value_given = true, min_qual_char_arg = {<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >> = {static npos =
18446744073709551615, _M_dataplus = {<std::allocator<char>> = {<__gnu_cxx::new_allocator<char>> = {<No data fields>}, <No data fields>}, _M_p = 0x616c98 ""}, _M_string_length = 0, {_M_local_buf = '\000' <repeats 15 times>, _M_allocated_capacity = 0}}, <No data fields>}, min_qual_char_given = false,
threads_arg = 104, threads_given = true, output_arg = 0x7fffffffd6ec "quorum_corrected_mer_database.jf", output_given = true, reprobe_arg = 126, reprobe_given = false, reads_arg = {<std::_Vector_base<char const*, std::allocator<char const*> >> = {
_M_impl = {<std::allocator<char const*>> = {<__gnu_cxx::new_allocator<char const*>> = {<No data fields>}, <No data fields>}, _M_start = 0x628fa0, _M_finish = 0x628fa8, _M_end_of_storage = 0x628fa8}}, <No data fields>}}
std::__ioinit = {static _S_refcount = 10, static _S_synced_with_stdio = true}
jellyfish::mer_dna_ns::mer_base_static<unsigned long, 0>::k_ = 31
(gdb)
Thank you,