A gigabyte is actually a measurement of memory; if your computer has 100 gigabytes of storage, that's pretty good. A minute is, as I'm sure you know, a measurement of time. So the two units are incompatible.
Minutes are a unit of time; GB is a unit of space on a hard drive.
The concept of measuring time in gigabytes is not accurate. A gigabyte is a unit of digital storage capacity, not time. It represents 1 billion bytes of data. Time is measured in units such as seconds, minutes, and hours.
That really depends a lot on the video's size and quality. Take a sample video in the desired quality, note how many MB it takes and for how many minutes it plays, and extrapolate from there. (1 GB = 1024 MB in the binary convention.)
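The extrapolation above can be sketched in a few lines of Python. The sample clip's size and length here are made-up example numbers, not figures from the original answer:

```python
# Hypothetical sample: a 4-minute clip that takes 60 MB on disk.
SAMPLE_MB = 60.0       # size of the sample clip (assumed)
SAMPLE_MINUTES = 4.0   # how long the sample plays (assumed)
GB_IN_MB = 1024.0      # binary convention: 1 GB = 1024 MB

mb_per_minute = SAMPLE_MB / SAMPLE_MINUTES   # 15 MB of video per minute
minutes_per_gb = GB_IN_MB / mb_per_minute    # extrapolate to a full GB

print(f"About {minutes_per_gb:.0f} minutes of this video fit in 1 GB")
```

Swap in your own sample's numbers to estimate your actual minutes-per-GB.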
A Gibibyte (GiB) is a unit of digital information storage equal to 2^30 bytes. One byte is equal to 8 bits, and one bit can represent a binary value of 0 or 1. Therefore, 1 Gibibyte is equivalent to 2^30 * 8 bits. To convert this to minutes, we need to know the data transfer rate in bits per minute to calculate the time it would take to transfer 1 Gibibyte of data.
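Given a transfer rate, the conversion described above is straightforward. The 50 Mbit/s link speed below is an assumed example value:

```python
# Time to transfer 1 GiB at an assumed link speed of 50 Mbit/s.
GIBIBYTE_BITS = 2**30 * 8        # 1 GiB = 2^30 bytes, 8 bits per byte
RATE_BITS_PER_SEC = 50_000_000   # assumed rate: 50 Mbit/s

seconds = GIBIBYTE_BITS / RATE_BITS_PER_SEC
minutes = seconds / 60
print(f"Transferring 1 GiB at 50 Mbit/s takes about {minutes:.1f} minutes")
```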
These two units are not compatible for conversion; minutes is time, gigabytes (GB) is computer memory.
Gigabytes have no connection with time.
4 million minutes
You can't convert gigabytes to time. Gigabytes measure memory storage.
1080
about 500 minutes
There are about 1000 minutes in 1 GB, which can contain lots of stuff: for example, songs, pictures, etc.
The number of movie minutes that can fit in one gigabyte depends on the quality of the video. For high-definition video, you may fit around 5-8 minutes per gigabyte, while for standard definition video, you may fit around 15-20 minutes per gigabyte.
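As a rough sanity check on those figures, minutes-per-gigabyte follows directly from the video's average bitrate. The bitrates below are assumed example values for HD and SD video:

```python
# Minutes of video that fit in 1 GB (decimal) at a given average bitrate.
def minutes_per_gb(bitrate_mbps: float) -> float:
    gb_bits = 8_000_000_000                      # 1 GB = 10^9 bytes = 8e9 bits
    seconds = gb_bits / (bitrate_mbps * 1_000_000)
    return seconds / 60

# Assumed example bitrates: ~18 Mbit/s for HD, ~7 Mbit/s for SD.
print(f"HD (~18 Mbit/s): {minutes_per_gb(18):.0f} min/GB")
print(f"SD (~7 Mbit/s):  {minutes_per_gb(7):.0f} min/GB")
```

These assumed bitrates land inside the 5-8 and 15-20 minutes-per-GB ranges given above.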
You can watch a Japanese movie on TV.
A lot more than you think it is, mate!