Anchor Positioning Is Disruptive
New layouts will be possible
The more I play with it, the more convinced I am that anchor positioning is going to unlock some surprising new layouts.
Are we measuring what we meant to measure?
There’s been a lot of interest in the results of the annual State of CSS survey, but are we asking all the right questions?
Sacha Greif’s annual State of CSS survey wrapped up recently, and the results are available.
I’ve seen a number of people commenting on the demographics data, asking what it says about our industry. Josh Comeau and Geoff Graham at CSS Tricks both provide some great commentary. For example: men in the US seem to make significantly more money than the rest of the field. Josh also points out when some questions have a low response rate.
Those are great questions to ask. But I haven’t seen any analysis asking if the survey demographics actually match the industry. Or how that would impact our understanding of other survey answers. For this, I would include a number of demographic questions from other sections of the survey – such as industry sector, CSS usage, and balance of time writing CSS (and HTML) vs JS. Do we know how representative this is?
If you don’t know Kevin Powell’s work teaching CSS, I highly recommend it. But as you might expect, his audience is weighted towards students. If he’s a primary source for respondents, that’s likely to impact the demographics, right?
There’s no problem with that – student perspectives are extremely useful! – but it might change how we read the data. With under 10,000 responses, we know we’re looking at a small and self-selected sample of the field. I know several excellent CSS authors who didn’t take the survey. Are some more likely to respond than others? For what reasons? Are these results representative of CSS authors generally? How would we know?
The largest ‘source’ is the survey itself, which likely represents repeat participants. So where did they come from initially? The earliest data I can find is from the 2019 survey, with the State of JS survey as the most popular source. Maybe it’s not surprising if a majority of those respondents write JS as their primary job?
In many cases, polls use demographic data to adjust for response bias – but that requires an understanding of how well the demographics match the larger reality, and also how that is likely to impact other data points. How would it change our interpretation of the survey more generally if we knew the answers to those questions?
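To make that weighting idea concrete, here’s a minimal post-stratification sketch in Python. The group labels, population shares, and responses are all hypothetical, invented purely for illustration – they are not drawn from the State of CSS data.

```python
# Minimal post-stratification sketch: reweight respondent groups so the
# sample matches known population shares, then recompute an estimate.
# All group labels and numbers below are hypothetical, for illustration only.

# Assumed share of each group in the real population of CSS authors.
population_share = {"student": 0.15, "professional": 0.85}

# Hypothetical responses: (group, answered "yes" to some survey question).
responses = [
    ("student", True), ("student", True), ("student", False),
    ("professional", True), ("professional", False), ("professional", False),
]

total = len(responses)
sample_share = {
    group: sum(1 for g, _ in responses if g == group) / total
    for group in population_share
}

# Each respondent's weight: population share divided by sample share.
weights = [population_share[g] / sample_share[g] for g, _ in responses]

unweighted = sum(answer for _, answer in responses) / total
weighted = sum(w * answer for w, (_, answer) in zip(weights, responses)) / sum(weights)

print(f"unweighted 'yes' rate: {unweighted:.2f}")  # 0.50
print(f"weighted 'yes' rate:   {weighted:.2f}")    # 0.38 (students were over-represented in the sample)
```

The catch is the one described above: the adjustment only works if we actually know the population shares, and for ‘people who write CSS’ we don’t.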
The survey does allow us to break down the data in various ways – and run comparisons. I don’t feel qualified to parse out what correlations are significant or relevant. But poking around a bit, a few things jumped out:
Clearly, I don’t have a lot of answers here. These correlations don’t tell us much without knowing how representative the data is. I’m just not sure what I’m looking at, or how it should be read. But it concerns me that browser vendors use surveys like this as a primary gauge of developer interest – seemingly without asking who’s represented, or who might be missing from the data.
Did you respond to the survey? What questions do you have around this data? Or what interesting correlations have you found? Let us know on Mastodon or Bluesky.
Performance, scope, and fallbacks for the anchor positioning polyfill
Our sponsors are supporting the continued development of the CSS Anchor Positioning Polyfill. Here’s a summary of the latest updates.
What makes something a ‘grid’, and what’s at stake?
Back in 2020, Firefox released a prototype for doing ‘masonry’ layout in CSS. Now all the browsers are eager to ship something, but there’s a hot debate about the best syntax to use.