It’s time to get rid of user-agent strings that declare anything other than desktop, mobile, or HTML version.
99% of sites only need to know your screen aspect ratio and maybe available input devices, can’t think of a good reason to share anything else
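And input devices are already covered by CSS media queries, entirely in the browser, so even that doesn’t need to be shared. Rough sketch, the selectors are just made up:

```css
/* Coarse pointer (touch) with no hover: make tap targets bigger */
@media (pointer: coarse) and (hover: none) {
  nav a {
    padding: 1rem;
  }
}

/* Fine pointer (mouse/trackpad): hover effects are safe to use */
@media (pointer: fine) {
  nav a:hover {
    text-decoration: underline;
  }
}
```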
Knowing OS is useful for download links.
I’d be down for an ask to allow that info. Sort of like how sites request access to cam and mic.
Before Windows 10, NVidia and others had this button Detect what thing suits me best on their websites. Now many of them just look it up in one’s fingerprint without asking.
Oh no, they’d have to list more than one link, the horror!
The vast majority of people would have no clue what to download.
Let them be confused. They’ll learn eventually. Or they won’t. Computers are too user friendly today anyway.
Fuckin oath. If we cater to the stupid too much the folks who are middling just get lazy. Make people think. It’s important that we know how to use our brains.
No one ever promised an easy to use internet.
It’s like undefined behavior - most sites happen to do the same predictable thing, but it’s perfectly acceptable for me to make my website as hard to use as I want. Ctrl-click the website logo to submit the form.
Microsoft hides their links if they see you’re running Linux. So you need to manually change your OS in the browser settings (i.e. spoof your user agent) to see the download link. Very convenient.
having 3 different ones solves that issue though? the user can figure out which OS they’re running pretty well imo.
I can tell you’ve never had to do T1 tech support before.
It’s kind of staggering just how illiterate users can be.
I doubt the fix is to make them need less literacy
When you are competing for customers, not providing the illiterate morons of the world with a simple UI leads to them going to your competitor which does.
And unfortunately those illiterate morons outnumber every one else by a significant chunk.
That’s a fair perspective, but most people strive for as few clicks between users and their targets as possible. Forcing a user to become semi-tech-competent by sending them on a fetch quest to figure out their OS, while not an inherently bad thing, does work against this overall goal…
Idk, it’s like education vs service industry goal setting, that’s all I’m trying to get at here lol
Edit: plus, there’s no guarantee that it will remain just the big 3 forever. There was a time before Linux, maybe we’ll see a time after Windows… Unlikely, but one can dream lol
deleted by creator
Since we have CSS, what would be the purpose of the server knowing the aspect ratio?
Ideally, to save bandwidth on both sides, the server would only want to serve you the JS and CSS you need. I’m not sure how frequently that optimization is made, however.
I’m a bit rusty on this, but I think you’d need to split your Sass/SCSS/etc. before Webpack will perform tree-shaking or allow lazy-loading. I don’t think many devs write it that way: personally, I like my mobile rules beside my desktop ones, since my styling is component-based.
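Roughly what I mean, with a made-up card component:

```css
/* card.css - desktop rules and mobile rules live side by side,
   so there's no separate mobile bundle for the build tool to strip out */
.card {
  display: flex;
  gap: 1rem;
}

@media (max-width: 600px) {
  .card {
    flex-direction: column; /* stack on narrow screens */
  }
}
```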
I haven’t done UI work in years so I’m not sure how they do it these days.
Fair point, there could be reasons, and I’d say there are no privacy concerns if that’s all they get, but I know it’s part of fingerprinting. I said 99% so they don’t even need to know that.
that’s how CSS gets its media queries: the user agent (the browser) evaluates them locally, so the server doesn’t need to know.
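Rough example; the breakpoints and layout are made up, but nothing here ever reaches the server:

```css
/* Evaluated entirely by the user agent in the browser */
main {
  display: grid;
  grid-template-columns: 3fr 1fr; /* wide default: content plus sidebar */
}

@media (max-aspect-ratio: 3/4) {
  /* portrait-ish screens: collapse to a single column */
  main {
    grid-template-columns: 1fr;
  }
}
```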
deleted by creator
as a front-end web developer, I’ve found it useful to know what user agent is requesting a page in order to load conditional styling. For example, to compensate for Safari’s god-awful outline support (pre-version 16).
The biggest offender is, surprisingly, Cloudflare. They will straight up refuse to serve you any site if your user agent is not one of the mainstream ones. It’s not even “find the traffic light to prove you’re human”, but a page basically saying “fuck you, go away”.
Well their job is to block weird bot-looking traffic…
what is more likely to be a bot? a unique and trackable user agent from a semi-niche browser engine, or a vanilla Chromium+Windows one which half of everyone uses?
Most semi and fully legitimate bots use a custom user agent.
what about malicious/unwanted bots? if cloudflare is trying to block bots, the bots will want to not look like bots. the easiest way to do that is to use a common user agent.
The user agent string is not useful for blocking bots. You can literally set it to whatever you like.
If I were a Firefox dev I’d start looking into building user agent spoofing right into the browser.
It already opens Facebook pages in a special isolated tab. They could have apple.com open in its own special “Safari” tab. I wonder if there’s anything preventing them from doing that. I guess it could be bad because it would make their market share appear even smaller.
The irony of Firefox officially spoofing user agents while everyone else uses some variant of “Mozilla” as their UA string is too much.
I think user agent spoofing is part of privacy.resistFingerprinting, but it’s a controversial feature and breaks a lot of webpages.
Broken webpages might be a good thing. There are too many browsers that aren’t adhering to standards. Stop coding around it and start publicly shaming these megacorps.
https://webaim.org/blog/user-agent-string-history/
kek
That was interesting to read.
That article is great! I have it linked on my website next to the text that displays the user agent of the user.
Unfortunately, user agents are not the only way to identify a browser; there are other ways to fingerprint a platform.
JavaScript as it is today also needs to be thrown into the trash bin of history. Websites should not contain additional code. If someone wants to send me an app hacked on top of website rendering, there should be a popup asking me first if I want to run it.
No, dynamic content should absolutely be able to be delivered through the open Web, not just through walled gardens. Apps are almost universally shit.
No problem with sending some JavaScript module extending the browser’s capabilities. But the problem I see is sending whole sites this way, sometimes even rendering the HTML in the visitor’s browser, yuck…
That’s a terrible idea. Every single thing other than a block of text requires JS.
This is absolutely not true and just a myth. Images, video playback, “show more”, forms, tabbing, animations, custom icons, hover effects, popups, background images and videos, light/dark mode, hamburger menus…
It’s hard to count the things you can do with the advanced format that is HTML+CSS. Saying JavaScript is necessary for anything other than a block of text is like saying that in Minecraft, command blocks are necessary for anything other than making voxel art.
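Just as a sketch, a “show more” section and light/dark mode with zero JavaScript (the markup is made up, obviously):

```html
<!-- "show more" with no JavaScript: the browser handles the toggle -->
<details>
  <summary>Show more</summary>
  <p>The rest of the article goes here.</p>
</details>

<style>
  /* light/dark mode with no JavaScript either */
  body { background: #fff; color: #111; }

  @media (prefers-color-scheme: dark) {
    body { background: #111; color: #eee; }
  }
</style>
```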
For basic things like interacting with your bank or government, running any additional code should be unnecessary. And I believe this needs to be a law targeting accessibility and compatibility.
For maps and dynamic updating, OK. But look at the web now: most sites are apps requiring 99% of web standards to be implemented just to work. No wonder it’s now all but impossible to actually make a new browser.
HTML was made to last. If a browser does not support some tag, it will still try to render the content anyway. Meanwhile, with today’s web apps, browsers in 2033 will be required to carry the kind of technical debt that until now was exclusive to operating systems.
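A toy example of what I mean (the custom tag and file name are made up):

```html
<!-- An unknown tag: old and new browsers alike still render the text inside it -->
<made-up-tag>This sentence survives even if the tag means nothing to your browser.</made-up-tag>

<!-- Built-in fallback content for a feature the browser might lack -->
<video src="clip.mp4" controls>
  Your browser doesn't support embedded video, but here's a
  <a href="clip.mp4">direct link to the file</a> instead.
</video>
```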
i don’t want them knowing desktop or mobile either. we all have good enough phones now to handle a proper website on mobile – mobile sites are fucking garbage.
steve jobs during the original iphone keynote did a whole segment on how you could load the full rich widescreen NYT website and zoom in and out and look at that rich text rendering. apps are ass, mobile sites are ass.
especially when they don’t even have all of the features of the desktop site
The number of sites that aggressively disable the pinch-to-zoom accessibility feature is too damn high.
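For anyone wondering, it’s usually a single meta tag doing the damage (and I believe some mobile browsers now ignore the no-zoom part for accessibility reasons):

```html
<!-- The usual culprit: a viewport meta tag that tries to lock zooming -->
<meta name="viewport"
      content="width=device-width, initial-scale=1, maximum-scale=1, user-scalable=no">

<!-- What it should be: let the user zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```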