As an aside, it's generally a good idea to use larger chunk sizes to make random data generation run faster.

The Python example below can generate 4GB of random data in about 10s on my laptop:

#!/usr/bin/env python3

import os

CHUNK_SIZE = 2 ** 16
NUMBER = 0x100000000   # 2^32    copies without md5-detected error, file length 4294967296
#NUMBER = 0x400000000  # 4*2^32  copies are erroneous per md5sum,   file length 17179869184
#NUMBER = 0x200000000  # 2*2^32  copies are erroneous per md5sum,   file length 8589934592
#NUMBER = 0x100000001  # 2^32+1  copies without md5-detected error, file length 4294967297
#NUMBER = 0x180000000  #         copies without md5-detected error, file length 6442450944
#NUMBER = 0x1FFFFFFFF

def write_file(length, chunk_size):
    with open('bigfile', 'wb') as outfile:
        for _ in range(length // chunk_size):
            outfile.write(os.urandom(chunk_size))
        # Write any remainder so the file is exactly `length` bytes
        # (e.g. NUMBER = 0x100000001 is not a multiple of CHUNK_SIZE).
        outfile.write(os.urandom(length % chunk_size))

def main():
    write_file(NUMBER, CHUNK_SIZE)

if __name__ == '__main__':
    main()
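
The "md5-detected error" notes in the script's comments come from comparing md5sum output for the original and the copy. The same check can be done in Python with hashlib; this is a minimal sketch (the file names in the usage comment are placeholders, not from the original posts):

```python
#!/usr/bin/env python3
# Minimal sketch: hash a file in chunks with hashlib and compare two
# copies, the same check the md5sum notes above rely on.
import hashlib

CHUNK_SIZE = 2 ** 16

def md5_of(path, chunk_size=CHUNK_SIZE):
    """Return the MD5 hex digest of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, 'rb') as infile:
        while True:
            chunk = infile.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
    return digest.hexdigest()

# Usage (hypothetical file names):
# if md5_of('bigfile') != md5_of('bigfile.copy'):
#     print('copy is corrupted')
```

Reading in chunks keeps memory use constant, which matters for the multi-gigabyte files discussed in this thread.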


On Thu, Apr 28, 2022 at 8:46 PM Kevin Stratton via WLUG <wlug@lists.wlug.org> wrote:
Thank you for reminding me about memtest86, I had a bad memory module. 
I am trying to RMA it now.   It seems unlikely that the memory would go
bad after about a year.  Live and learn....

Ubuntu just updated to 22.04 with the kernel 5.15.0-27-generic.

On 4/28/2022 1:56 PM, Chuck Anderson wrote:

> What are the kernel version(s)?  What filesystem?
>
> I suggest:
>
> - Try memtest86+
>
> - Try a live Fedora 35 on a USB stick since all your other tests were
>    relying on Ubuntu, even the Windows 10 which was a VM on Ubuntu.
>    Fedora usually has much newer kernels, so if there is a Linux bug,
>    it might be fixed already.
>
> On Thu, Apr 28, 2022 at 11:55:04AM -0400, Kevin Stratton via WLUG wrote:
>> I have been chasing my tail trying to download a very large file to my desktop.  The bug is strange enough that it might be specific to hardware.
>>
>> The issue is that I cannot copy very large files accurately.  The threshold file size seems to be about 8589934592 bytes (2^33).  Any file this size or larger seems to be uncopyable on my desktop.
>>
>> I had downloaded a corrupted file of 71.9GB with Ubuntu 20.04, Ubuntu 22.04, a Windows 10 virtual machine running under Ubuntu 20.04, and Manjaro.  All of these OSes were running on the same hardware.
>>
>> I had no issues downloading the file from a Raspberry Pi with an external hard drive.
>>
>> I have included the C file I used to create the files, which also has some notes on the results of the tests.
>>
>>
>>
>> Any advice or help would be appreciated.
_______________________________________________
WLUG mailing list -- wlug@lists.wlug.org
To unsubscribe send an email to wlug-leave@lists.wlug.org
Create Account: https://wlug.mailman3.com/accounts/signup/
Change Settings: https://wlug.mailman3.com/postorius/lists/wlug.lists.wlug.org/
Web Forum/Archive: https://wlug.mailman3.com/hyperkitty/list/wlug@lists.wlug.org/message/H4Y3U25WIXVP7GBHM55YXYRLLXFEC6LF/