A minute is a unit of time.
A gigabyte (GB) is a unit of storage space, such as on a hard drive.
Measuring time in gigabytes is not meaningful. A gigabyte is a unit of digital storage capacity representing one billion bytes of data; time is measured in units such as seconds, minutes, and hours.
A gibibyte (GiB) is a unit of digital information storage equal to 2^30 bytes. One byte is 8 bits, and one bit represents a binary value of 0 or 1, so 1 GiB is equivalent to 2^30 * 8 bits. There is no direct conversion to minutes; you also need a data transfer rate, and dividing 2^30 * 8 bits by a rate in bits per minute gives the time it would take to transfer 1 GiB of data.
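As a rough sketch of that arithmetic in Python (the 10 Mbit/s rate below is an assumed value chosen for illustration, not something stated in the answer above):

```python
# Time to transfer 1 GiB at an assumed data rate.
GIBIBYTE_BITS = 2**30 * 8          # 1 GiB expressed in bits (8,589,934,592)
rate_bits_per_second = 10_000_000  # assumption: a 10 Mbit/s connection

seconds = GIBIBYTE_BITS / rate_bits_per_second
print(f"1 GiB at 10 Mbit/s takes about {seconds / 60:.1f} minutes")
# -> about 14.3 minutes; a faster or slower link scales this directly
```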
Gigabytes are actually a measurement of memory. If your computer has 100 gigabytes, that's pretty good. A minute is, as I'm sure you know, a measurement of time. So the two are incompatible.
That really depends on the quality of the video. A low-quality video may use about 1 megabyte (not gigabyte) per minute, or a few megabytes per minute. A DVD, which is already high quality, holds 4.7 gigabytes for a playing time of perhaps a little over 2 hours (about 120 minutes). A Blu-ray disc, which offers still higher quality, uses about 25 gigabytes for the same playing time.
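To make those per-minute figures concrete, here is a short Python sketch using the disc capacities and the roughly 120-minute runtime cited above; actual values vary with the encoding:

```python
# Approximate storage used per minute of video for the formats above.
formats = {
    "DVD": (4.7, 120),      # (capacity in GB, playing time in minutes)
    "Blu-ray": (25.0, 120),
}

for name, (capacity_gb, minutes) in formats.items():
    gb_per_min = capacity_gb / minutes
    print(f"{name}: ~{gb_per_min:.3f} GB/min (~{gb_per_min * 1000:.0f} MB/min)")
# -> DVD: ~0.039 GB/min (~39 MB/min); Blu-ray: ~0.208 GB/min (~208 MB/min)
```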