30 min 28.6 sec (30.48 min), if 56 kBps (kilobytes per second) is meant: 100 MB = 102,400 KB, and 102,400 KB / 56 KB/s ≈ 1,829 seconds.

___________________

I worked it out as:

100 MB = 100*1024*1024*8 = 838,860,800 bits

838,860,800 / 1024 = 819,200 kilobits

Thus it will take approximately 819,200/56 ≈ 14,629 seconds (to the nearest second), which is about 4.06 hours (if it runs at full speed the whole time)!

[This answer assumes 56 kbps means 56 kilobits per second, not kilobytes. If you meant 56 kBps, then the first answer above would be right.]
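The arithmetic above is easy to double-check with a short script; this is just a sketch of the same calculation, covering both readings of "56k":

```python
# Transfer time for a 100 MB file over a 56 kbps (kilobit/s) link,
# using binary megabytes (1 MB = 1024 * 1024 bytes).
file_bits = 100 * 1024 * 1024 * 8          # 838,860,800 bits
seconds = file_bits / 1024 / 56            # 819,200 kilobits / 56 kbit/s
print(round(seconds))                      # 14629 seconds
print(round(seconds / 3600, 2))            # 4.06 hours

# If "56k" meant 56 kBps (kilobytes per second) instead:
seconds_kBps = (100 * 1024) / 56           # 102,400 KB / 56 KB/s
print(round(seconds_kBps / 60, 2))         # 30.48 minutes
```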

Wiki User

13y ago

More answers

It depends on which convention is used: whether a gigabyte is 1024³ bytes or 1000³ bytes, and likewise whether a megabit is 10⁶ or 2²⁰ bits. Note that there are 8 bits in a byte, but transferring can require up to 10 bits per byte, depending on framing and error-correction overhead. Taken together, those choices give a range from about 272 seconds to 383 seconds, i.e. between roughly 4 min 32 sec and 6 min 23 sec.
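One way to see where such a range comes from is to enumerate every combination of those conventions; a small sketch (the 8-vs-10 bits-per-byte overhead figure is taken from the answer above):

```python
from itertools import product

gigabyte = (10**9, 2**30)       # decimal vs binary gigabyte, in bytes
megabit = (10**6, 2**20)        # decimal vs binary megabit, in bits
bits_per_byte = (8, 10)         # raw, or with per-byte framing overhead

# Transfer time in seconds for 2 GB at 56 Mbps, for all 8 combinations.
times = [2 * gb * bpb / (56 * mb)
         for gb, mb, bpb in product(gigabyte, megabit, bits_per_byte)]
print(f"{min(times):.0f} s to {max(times):.0f} s")  # 272 s to 383 s
```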

Wiki User

12y ago

3.46 days (this presumably reads the rate as 56 kbps, not 56 Mbps).
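That one-line figure is reproducible if, as seems likely (an assumption on my part), it read the rate as 56 kbps rather than 56 Mbps:

```python
# Assumption: "3.46 days" treats the link as 56 kbps (binary kilobits).
bits = 2 * 2**30 * 8        # 2 GB (binary) in bits
rate = 56 * 1024            # 56 kbps in bit/s
days = bits / rate / 86400
print(round(days, 2))       # 3.47 (the quoted 3.46 truncates the same value)
```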

Wiki User

12y ago
Q: How long will it take to transfer a 2GB file at 56 Mbps?