Converting data storage (megabytes) to time (minutes) is not straightforward, as it depends on the type of data being measured. If we assume an average transfer rate of 1 megabyte per second, then 25 megabytes would take approximately 25 seconds to transfer. However, if we consider streaming video content that uses around 5 megabytes per minute, then 25 megabytes would equate to about 5 minutes of playback.
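A minimal sketch of that arithmetic; the 1 MB/s transfer rate and the 5 MB/min streaming figure are just the assumptions from the answer above, not fixed facts:

```python
# Rough size-to-time arithmetic. Both rates are assumptions taken from the
# answer above; real transfer speeds and streaming bitrates vary widely.

def transfer_seconds(size_mb: float, rate_mb_per_s: float = 1.0) -> float:
    """Seconds needed to transfer size_mb at rate_mb_per_s."""
    return size_mb / rate_mb_per_s

def playback_minutes(size_mb: float, mb_per_minute: float = 5.0) -> float:
    """Minutes of playback if the stream consumes mb_per_minute."""
    return size_mb / mb_per_minute

print(transfer_seconds(25))   # 25.0 seconds at 1 MB/s
print(playback_minutes(25))   # 5.0 minutes at 5 MB/min
```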
9 minutes smartasses
7 HOURS
The answer depends on 800 MB of WHAT: low quality audio or high definition video?
There is no sensible answer to the question. A megabyte is a measure of computer memory, whereas 120 minutes is a measure of time. By basic dimensional analysis, a conversion from one to the other is not valid. For example, consider 120 minutes of high-quality video and 120 minutes of low-quality audio: there will be a huge difference in the memory requirements of the two.
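To put rough numbers on that contrast, here is a sketch with purely illustrative bitrates (the 5 Mbit/s and 64 kbit/s figures are assumptions, not standards):

```python
# Illustrative bitrates only: high-quality video at ~5 Mbit/s versus
# low-quality audio at ~64 kbit/s, both playing for 120 minutes.

seconds = 120 * 60

video_mb = 5_000_000 * seconds / 8 / 1_000_000   # -> 4500.0 MB
audio_mb = 64_000 * seconds / 8 / 1_000_000      # -> 57.6 MB

print(f"video: ~{video_mb:.0f} MB, audio: ~{audio_mb:.1f} MB")
```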
That really depends a lot on the video's size and quality. Take a sample of a video in the desired quality, see how many MB it takes and for how many minutes it plays, and extrapolate from there. (1 GB = 1024 MB)
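A minimal sketch of that extrapolation, assuming you have measured a sample clip's size and duration yourself (the 60 MB / 4 minute sample below is a made-up example measurement):

```python
# Extrapolate linearly from a sample clip you measured yourself.

def minutes_for(size_mb: float, sample_mb: float, sample_minutes: float) -> float:
    """Minutes of playback for size_mb, scaled from the measured sample."""
    return size_mb * (sample_minutes / sample_mb)

# e.g. estimating 800 MB from a 60 MB sample that plays for 4 minutes:
print(minutes_for(800, sample_mb=60, sample_minutes=4))  # ~53.3 minutes
```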
25 gigabytes = 25,600 megabytes (using the binary definition, 1 GB = 1,024 MB)
A gigabyte is 1,000 megabytes under the decimal (SI) definition, so there are 25,000 megabytes in 25 gigabytes.
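The two answers above differ only in which definition of a gigabyte they use:

```python
# Decimal (SI) versus binary definitions of a gigabyte.
gb = 25
print(gb * 1000)  # 25000 MB, decimal (SI) definition
print(gb * 1024)  # 25600 MB, binary definition
```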
How many minutes make 1 MB?
25 MB
There are 1,048,576 bytes in one megabyte (the binary definition, 2^20). To figure out how many bytes are in 25 megabytes, you need to multiply 1,048,576 by 25. Using this equation, you can figure out that there are 26,214,400 bytes in 25 megabytes.
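The same arithmetic in code, using the binary definition of a megabyte:

```python
BYTES_PER_MB = 1024 * 1024       # binary definition: 1,048,576 bytes
print(25 * BYTES_PER_MB)         # 26214400 bytes in 25 megabytes
```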
It depends on the quality.
around 300...
You are comparing two entirely unrelated units of measure. You might as well ask how many days there are in pi, for all the sense it makes to compare minutes to megabytes.
3:11 sec.
None. The units are incompatible.