7 HOURS
There is no sensible answer to this question. A megabyte is a measure of computer memory, whereas 120 minutes is a measure of time; by basic dimensional analysis, converting one to the other is not valid. For example, consider 120 minutes of high-quality video versus 120 minutes of low-quality audio: the memory requirements of the two will differ enormously.
Converting data storage (megabytes) to time (minutes) is not straightforward, because it depends on the type of data being measured. If we assume an average transfer rate of 1 megabyte per second, then 25 megabytes would take approximately 25 seconds to transfer. If we instead consider streaming video content, which typically uses around 5 megabytes per minute, then 25 megabytes would equate to about 5 minutes of video playback.
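The arithmetic in that answer can be sketched as a small helper. This is only an illustration: the 1 MB/s transfer rate and the 5 MB-per-minute video rate are the example figures from the answer, not universal constants.

```python
def mb_to_minutes(size_mb, rate_mb_per_min):
    """Convert a size in MB to time in minutes, assuming a
    constant data rate given in MB per minute."""
    return size_mb / rate_mb_per_min

# 25 MB at 1 MB/s (i.e. 60 MB per minute) transfers in 25 seconds
transfer_seconds = mb_to_minutes(25, 60) * 60

# 25 MB of video at ~5 MB per minute plays for about 5 minutes
playback_minutes = mb_to_minutes(25, 5)

print(transfer_seconds, playback_minutes)
```

The point the function makes explicit is that the answer changes entirely with the assumed rate; there is no conversion without one.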
That depends a great deal on the video's size and quality. Take a sample video at the desired quality, note how many MB it occupies and how many minutes it plays, and extrapolate from there. (1 GB = 1024 MB)
MB, which stands for megabytes, is not a measure of time; it is a measure of computer storage. 50 MB is about the size of two short songs in the right format.
1 Gigabyte is approx. 1 million Kilobytes, so 0.8 Gigabytes
76,800 MB
How many minutes make 1 MB?
0.82 GB is 839.68 MB (0.82 × 1024).
It depends on the quality.
You are comparing two entirely unrelated units of measure. You might as well ask how many days there are in pi, for all the sense it makes to compare minutes to megabytes.
100+ if compressed, for the 2 MB; and 800+ for the 4 GB.
3:11 sec.
None. The units are incompatible.
A 650 MB CD-R can hold 74 minutes of music. A 700 MB CD-R can hold 80 minutes of music.
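Those CD-R figures imply a fixed ratio of data capacity to audio playing time. A quick check of that ratio (a hypothetical helper using only the 650 MB / 74 min and 700 MB / 80 min numbers above):

```python
def mb_per_minute(capacity_mb, minutes):
    """Ratio of a disc's data capacity to its audio playing time."""
    return capacity_mb / minutes

# Both disc sizes give roughly the same ratio, as expected,
# since CD audio is recorded at a constant rate.
print(round(mb_per_minute(650, 74), 2))
print(round(mb_per_minute(700, 80), 2))
```

Both come out near 8.8 MB of data capacity per minute of audio, which is why the two disc sizes scale consistently.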
0.8 Megabytes