- Not to be confused with Gigabyte Technology.
A gigabyte is a unit of information or computer storage equal to one billion bytes. It is commonly abbreviated GB (not to be confused with Gb, the abbreviation for gigabit) and is informally called a gig in writing or speech.
There are two slightly different definitions of the size of a gigabyte in use:
- 1,000,000,000 bytes, equal to 1000³ or 10⁹ bytes, is the decimal definition used in telecommunications (such as network speeds) and by manufacturers of some computer storage devices (such as hard disks and flash memory drives). This usage is compatible with the International System of Units (SI).
- 1,073,741,824 bytes, equal to 1024³ or 2³⁰ bytes, is the binary definition typically used for computer memory sizes, and most often used in computer engineering, computer science, and most aspects of computer operating systems. The IEC recommends that this unit instead be called a gibibyte (abbreviated GiB), because the binary use of "gigabyte" conflicts with the SI meaning of the giga- prefix used elsewhere, such as for bus and network speeds. A brief worked conversion between the two definitions follows this list.
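As a rough sketch of the difference between the two definitions, the snippet below converts a decimal-gigabyte figure into binary gibibytes. The constant and function names are illustrative choices for this example, not terms from any standard.

```python
# The two "gigabyte" sizes described above.
GB_DECIMAL = 10 ** 9   # 1,000,000,000 bytes (SI / telecom / storage marketing)
GIB_BINARY = 2 ** 30   # 1,073,741,824 bytes (IEC "gibibyte", GiB)

def decimal_gb_to_gib(gigabytes: float) -> float:
    """Convert a decimal-gigabyte figure into binary gibibytes."""
    return gigabytes * GB_DECIMAL / GIB_BINARY

# A drive marketed as 500 GB (decimal) holds 500,000,000,000 bytes,
# which is roughly 465.66 GiB when expressed in binary units.
print(f"{decimal_gb_to_gib(500):.2f} GiB")   # -> 465.66 GiB
```

The roughly 7% gap between the two figures is why a newly formatted drive appears "smaller" than its advertised size when an operating system reports capacity in binary units.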
Starting with Mac OS X 10.6 in 2009 and iOS 11 in 2017, Apple stopped reporting capacities in binary units and switched to decimal ones, so a "gigabyte" is counted as 1,000,000,000 bytes rather than the 1,073,741,824 bytes used previously. Because the same number of bytes divides into more decimal gigabytes than binary ones, this created the illusion of a slight increase in data capacity (memory and storage), as the sketch below illustrates.[1][2]
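A minimal sketch of that reporting change, assuming a hypothetical 64-billion-byte flash drive; the function name and figures are illustrative, not Apple's actual implementation.

```python
def report_capacity(num_bytes: int, decimal: bool) -> str:
    """Format a byte count as "gigabytes" under either convention."""
    divisor = 10 ** 9 if decimal else 2 ** 30
    return f"{num_bytes / divisor:.1f} GB"

drive = 64_000_000_000  # a flash drive containing 64 billion bytes

print(report_capacity(drive, decimal=False))  # ~"59.6 GB" (binary, pre-10.6 style)
print(report_capacity(drive, decimal=True))   # "64.0 GB"  (decimal, 10.6 and later)
```

The drive itself is unchanged; only the divisor used to produce the displayed figure differs, which is why the switch made capacities appear slightly larger.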
References
- ↑ Snell, Jason (2009-08-28). Snow Leopard's new maths. Macworld. Archived from the original on 2012-04-02.
- ↑ How iOS and macOS report storage capacity. Apple Support (2018-02-27).