Approximately 1,828.6 seconds ≈ 30.476 minutes (about 30 min 29 sec), assuming 56 kB per second: 100 MB = 102,400 kB, and 102,400 ÷ 56 ≈ 1,828.6 s.
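A minimal sketch of that arithmetic, assuming "56k" means 56 kilobytes per second and 1 MB = 1024 × 1024 bytes (the conventions this answer uses):

```python
# Quick check of the figure above, reading "56k" as 56 kilobytes per second
# and using 1 MB = 1024 kB (binary convention); protocol overhead is ignored.
size_kB = 100 * 1024                                  # 100 MB expressed in kB
seconds = size_kB / 56                                # kB divided by kB per second
print(f"{seconds:.1f} s = {seconds / 60:.2f} min")    # ~1828.6 s ≈ 30.48 min
```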
___________________
I worked it out as:
100 MB = 100 * 1024 * 1024 * 8 = 838,860,800 bits
838,860,800 / 1024 = 819,200 kb
Thus it will take approximately 819,200 / 56 = 14,629 seconds (to the nearest second) = 4.06 hours approximately (if it runs at full speed the whole time)!
[This answer assumes 56 kbps means 56 kilobits per second (not kilobytes). If you meant 56 kBps, then the first answer above would be right.]
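For the kilobit reading, the same steps as the working above can be written out as a short sketch (again assuming 1 MB = 1024 × 1024 bytes and 1 kb = 1024 bits):

```python
# The calculation above: 100 MB over a 56 kbit/s line, full speed, no overhead.
bits = 100 * 1024 * 1024 * 8                          # 838,860,800 bits
kilobits = bits / 1024                                # 819,200 kb
seconds = kilobits / 56                               # ~14,628.6 s
print(f"{seconds:.0f} s ≈ {seconds / 3600:.2f} h")    # ~14629 s ≈ 4.06 hours
```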
It depends on which convention is used: whether a gigabyte is 1024³ bytes or 1000³ bytes, and likewise for the megabits per second. Note that there are 8 bits in a byte, but transferring can require up to 10 bits per byte, depending on the error-correction scheme. That gives a range from 340 seconds to 384 seconds, or between 5 min 40 sec and 6 min 24 sec.
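A sketch of that range calculation, trying each combination of conventions. The size and speed below are placeholder values (the question's actual figures aren't quoted in this answer), so the printed bracket will differ from the 340 to 384 seconds above; substitute the real numbers from the question to reproduce that range.

```python
# Download-time range under different unit conventions.
def transfer_seconds(gigabytes, mbps, byte_base, bit_base, bits_per_byte):
    """Total bits divided by line rate in bits per second."""
    total_bits = gigabytes * byte_base**3 * bits_per_byte
    rate_bps = mbps * bit_base**2
    return total_bits / rate_bps

size_gb, speed_mbps = 1.0, 25.0       # hypothetical placeholder figures

times = [transfer_seconds(size_gb, speed_mbps, bb, mb, bpb)
         for bb in (1000, 1024)       # GB as 1000^3 or 1024^3 bytes
         for mb in (1000, 1024)       # Mbps as 1000^2 or 1024^2 bits/s
         for bpb in (8, 10)]          # 8 data bits, or ~10 with error-correction overhead
print(f"{min(times):.0f} s to {max(times):.0f} s")
```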