There is no sensible answer to the question as posed. A megabyte is a measure of data storage, whereas 120 minutes is a measure of time, and by basic dimensional analysis a conversion from one to the other is not valid without a data rate linking them.
For example, consider 120 minutes of high-quality video and 120 minutes of low-quality audio: there will be a huge difference in the storage requirements of the two.
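To make that concrete, here is a minimal Python sketch; the per-minute rates are illustrative assumptions, not measurements:

```python
MB_PER_MIN_HQ_VIDEO = 40   # assumed rate for high-quality video, roughly DVD-like
MB_PER_MIN_LQ_AUDIO = 1    # assumed rate for low-quality audio

def megabytes(minutes, mb_per_minute):
    """Storage only follows from time once a data rate is assumed."""
    return minutes * mb_per_minute

print(megabytes(120, MB_PER_MIN_HQ_VIDEO))  # 4800 MB
print(megabytes(120, MB_PER_MIN_LQ_AUDIO))  # 120 MB
```

Same 120 minutes, a 40-fold difference in megabytes, which is why no single conversion factor exists.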
120 minutes equals 2 hours.
Two hours is equal to 120 minutes. This conversion is based on the fact that there are 60 minutes in one hour. Therefore, to find the total number of minutes in two hours, you would multiply 2 hours by 60 minutes/hour, which equals 120 minutes.
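As a quick sanity check of that arithmetic in Python:

```python
hours = 2
minutes = hours * 60  # 60 minutes per hour
print(minutes)  # 120
```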
That really depends on the quality of the video. A low-quality video may use 1 megabyte (not gigabyte) per minute, or a few megabytes per minute. A DVD, which is already high quality, holds 4.7 gigabytes for a playing time of perhaps a little over 2 hours (120 minutes). A Blu-ray disc, which has still higher quality, uses about 25 gigabytes for the same playing time.
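The same estimate can be worked out from an average bitrate. Here is a rough Python sketch; the bitrate figures are ballpark assumptions for illustration, not format specifications:

```python
def size_gb(bitrate_mbps, minutes):
    """Estimate file size in gigabytes from an assumed constant average bitrate.

    bitrate_mbps: average bitrate in megabits per second
    minutes: playing time in minutes
    """
    seconds = minutes * 60
    megabits = bitrate_mbps * seconds
    return megabits / 8 / 1000  # bits -> bytes, then megabytes -> gigabytes

# Illustrative ballpark bitrates (assumptions):
print(round(size_gb(5, 120), 1))   # DVD-like ~5 Mbps over 120 min -> ~4.5 GB
print(round(size_gb(28, 120), 1))  # Blu-ray-like ~28 Mbps over 120 min -> ~25.2 GB
```

These rough numbers line up with the 4.7 GB and 25 GB disc capacities mentioned above.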
To convert hours to minutes, you multiply by 60 since there are 60 minutes in an hour. So, 2 hours is equal to 2 x 60 = 120 minutes. Adding 30 minutes to this gives a total of 120 + 30 = 150 minutes. Therefore, 2 hours and 30 minutes is equal to 150 minutes.
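The same idea in Python, with the extra 30 minutes added on (the helper name is just for illustration):

```python
def to_minutes(hours, extra_minutes=0):
    """Total minutes from hours plus any leftover minutes."""
    return hours * 60 + extra_minutes

print(to_minutes(2, 30))  # 150
```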
There are 60 seconds in 1 minute. Hence, in 2 minutes there will be 60 x 2 = 120 seconds.
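And the corresponding check in Python:

```python
seconds = 2 * 60  # 60 seconds per minute
print(seconds)  # 120
```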