TV screens, like monitors and smartphones, are measured diagonally. So, when a manufacturer markets a TV as 40 inches, that means it measures 40 inches from one corner to the opposite corner.

For those who use the metric system, one inch is equal to 2.54 centimeters. **So, a 40-inch/101.6-centimeter TV with an aspect ratio of 16:9 would measure roughly 34.9 × 19.6 inches (88.6 × 49.8 centimeters).**

While knowing the size of the TV screen can be helpful in making a purchase decision, if you plan on mounting your TV on a wall, it also helps to know the physical size of the device.

In this guide, I’ll show you how you can measure the size of a 40-inch/101.6-centimeter TV.

**Why Does Screen Size Matter?**

The marketed size of a TV, expressed in inches or centimeters, refers to the screen’s diagonal measurement from one corner to the opposite corner. So, for instance, a 40-inch TV would measure 40 inches (101.6 centimeters) from the bottom-left corner to the top-right corner.

While the size of the TV—or any gadget with a screen, for that matter—will give you a rough idea of how much wall space it will take up, it’s not an exact science. We’ll have to do a bit of simple mathematics to determine the overall area of the TV before deciding where to mount it.

**Let’s Talk About Aspect Ratios**

Before we get into the actual calculations, there’s something we need to address: aspect ratios.

The aspect ratio of a screen, expressed as width:height (for example, 16:9), is the ratio of a rectangular display’s width to its height.

The most common aspect ratio for TVs and monitors is 16:9, though 21:9 and 4:3 aspect ratios are still widely used.

The aspect ratio does not directly correlate to the screen’s size. In simple terms, it describes the proportion of the screen. We will need to know the aspect ratio to determine how wide and tall a TV screen is, and we use those two figures to determine the screen’s total area.

**Average 40-Inch TV Dimensions**

So, in this example, let’s take a look at a 40-inch/101.6-centimeter TV with a 16:9 aspect ratio. We can input these figures into the following formula to figure out how much space the TV will take up on a wall.

The Pythagorean Theorem (a² + b² = c²) can only partially help since we have to take the screen’s aspect ratio into account. After adjusting the formula to include the aspect ratio, here’s what we get:

H = D ÷ √(AR² + 1)

We can calculate the height of a 40-inch/101.6-centimeter TV screen with a 16:9 aspect ratio in the following way:

- H = 40 ÷ √((16/9)² + 1)
- H = 40 ÷ √(3.16 + 1)
- H = 40 ÷ √4.16
- H = 40 ÷ 2.04
- H ≈ 19.6 inches
- H ≈ 49.8 centimeters
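The height calculation above can be sketched in a few lines of Python (a minimal sketch; the `screen_height` function name is my own, and the aspect ratio is passed as a plain number, 16 ÷ 9):

```python
import math

def screen_height(diagonal, aspect_ratio):
    """Screen height from the diagonal and the width:height aspect ratio."""
    return diagonal / math.sqrt(aspect_ratio ** 2 + 1)

# 40-inch TV with a 16:9 aspect ratio
h = screen_height(40, 16 / 9)
print(round(h, 1))  # height in inches, about 19.6
```

Because the code doesn’t round the intermediate values, its result can differ slightly from a hand calculation that rounds at each step.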

After learning the TV’s height, we can use the following formula to determine the width.

W = AR × H

So, here’s how it’s done:

- W = (16/9) × 19.6
- W = 1.78 × 19.6
- W ≈ 34.9 inches
- W ≈ 88.6 centimeters
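The width step can be sketched the same way (again with a hypothetical function name; multiplying by 2.54 converts inches to centimeters):

```python
import math

def screen_width(diagonal, aspect_ratio):
    """Screen width: the aspect ratio times the height."""
    height = diagonal / math.sqrt(aspect_ratio ** 2 + 1)
    return aspect_ratio * height

w = screen_width(40, 16 / 9)
print(round(w, 1), round(w * 2.54, 1))  # inches, then centimeters
```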

Now that we know the dimensions of the 40-inch, 16:9 TV (34.9 × 19.6 inches or 88.6 × 49.8 centimeters), we can figure out the total area of the TV screen by multiplying the width by the height.

- A = W × H
- A = 34.9 × 19.6
- A ≈ 684 square inches
- A ≈ 4,412 square centimeters
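Putting the pieces together, the area works out like this in Python (a sketch with my own naming; one square inch is 2.54² = 6.4516 square centimeters):

```python
import math

def screen_area(diagonal, aspect_ratio):
    """Screen area (width x height) in the square of the diagonal's unit."""
    height = diagonal / math.sqrt(aspect_ratio ** 2 + 1)
    width = aspect_ratio * height
    return width * height

area_sq_in = screen_area(40, 16 / 9)
area_sq_cm = area_sq_in * 2.54 ** 2  # convert square inches to square centimeters
print(round(area_sq_in), round(area_sq_cm))
```

Since nothing is rounded until the end, the output lands within a square inch or so of the rounded hand calculation.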

So, as long as you know the TV’s diagonal measurement (marketed size) and the aspect ratio, figuring out the dimensions and total area of the TV is pretty straightforward.

**What About Resolution?**

There is one common misconception people have regarding a TV’s size and its resolution: namely, that a larger screen means better picture quality. Larger TVs do tend to ship with higher resolutions, but size and resolution are independent specifications; one does not determine the other.

The screen size simply refers to how large the screen is from one corner to the opposite corner. The resolution, on the other hand, describes how many pixels are packed into the screen’s area.

To demonstrate what I mean, let’s compare a 32-inch TV with a 1920 × 1080-pixel resolution against a 40-inch TV with 1280 × 720 pixels. From these extreme examples, we can objectively see that the smaller TV has a higher resolution than the larger TV.

**What About Pixel Density?**

Pixel density is a figure that describes how many pixels are packed along a given length of the TV screen, which is usually one inch (pixels per inch, or PPI).

A TV’s pixel density will give us a clearer idea of how good the screen looks. Basically, the more pixels packed into a square inch, the more difficult it will be for the viewer to notice individual pixels.

We can calculate the PPI in two steps. First, we find the screen’s diagonal in pixels:

DiagonalPixels = √(WidthPixels² + HeightPixels²)

So, using the two TVs from earlier (1080p 32-inch TV vs. 720p 40-inch TV), we would get:

- 32-inch TV DiagonalPixels = √(1920² + 1080²)
- 32-inch TV DiagonalPixels = √(3,686,400 + 1,166,400)
- 32-inch TV DiagonalPixels = √4,852,800
- 32-inch TV DiagonalPixels ≈ 2,202.9 pixels

- 40-inch TV DiagonalPixels = √(1280² + 720²)
- 40-inch TV DiagonalPixels = √(1,638,400 + 518,400)
- 40-inch TV DiagonalPixels = √2,156,800
- 40-inch TV DiagonalPixels ≈ 1,468.6 pixels
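Both diagonal-pixel calculations above can be reproduced with a short Python sketch (the `diagonal_pixels` helper is my own naming):

```python
import math

def diagonal_pixels(width_px, height_px):
    """Diagonal pixel count of a panel, via the Pythagorean theorem."""
    return math.sqrt(width_px ** 2 + height_px ** 2)

print(round(diagonal_pixels(1920, 1080), 1))  # 1080p panel, about 2,202.9
print(round(diagonal_pixels(1280, 720), 1))   # 720p panel, about 1,468.6
```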

Now, let’s compare their pixel densities in pixels per inch:

- Pixel Density = DiagonalPixels ÷ DiagonalInches
- 32-inch TV Pixel Density = 2,202.9 ÷ 32
- 32-inch TV Pixel Density = 68.8

- 40-inch TV Pixel Density = 1,468.6 ÷ 40
- 40-inch TV Pixel Density = 36.7

In pixels per centimeter, you would get:

- Pixel Density = DiagonalPixels ÷ DiagonalCentimeters
- 32-inch TV Pixel Density = 2,202.9 ÷ 81.3
- 32-inch TV Pixel Density = 27.1

- 40-inch TV Pixel Density = 1,468.6 ÷ 101.6
- 40-inch TV Pixel Density = 14.5
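The density step can be sketched as one more hypothetical helper; dividing the diagonal pixel count by the diagonal length gives pixels per inch, and passing the diagonal in centimeters instead gives pixels per centimeter:

```python
import math

INCH_TO_CM = 2.54

def pixel_density(width_px, height_px, diagonal):
    """Pixels per unit of length; PPI when the diagonal is given in inches."""
    return math.sqrt(width_px ** 2 + height_px ** 2) / diagonal

ppi_32 = pixel_density(1920, 1080, 32)                 # per inch
ppcm_32 = pixel_density(1920, 1080, 32 * INCH_TO_CM)   # per centimeter
print(round(ppi_32, 1), round(ppcm_32, 1))
```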

Again, using pixel density, we can objectively determine that the 32-inch TV delivers a sharper picture than the 40-inch TV.

**Conclusion**

In this guide, I described how to take the size of a TV in inches or centimeters and calculate its height, width, and area. To refresh your memory, a 40-inch (101.6-centimeter) TV with an aspect ratio of 16:9 would measure roughly 34.9 × 19.6 inches (88.6 × 49.8 centimeters).

If you found this guide helpful, please let your friends know by sending this article to them. Also, I’d love to hear what TV size and aspect ratio you’re currently rocking in the comments section.
