I’ve actually noticed this on some websites over the past ~two months. It’s neat to finally have a captcha that doesn’t require slowly clicking through images to get past it.
Second, we find that a few privacy-focused users often ask their browsers to go beyond standard practices to preserve their anonymity. This includes changing their user-agent (something bots will do to evade detection as well), and preventing third-party scripts from executing entirely. Issues caused by this behavior can now be displayed clearly in a Turnstile widget, so those users can immediately understand the issue and make a conscientious choice about whether they want to allow their browser to pass a challenge.
For those of you who browse the internet with JS disabled (e.g. using NoScript), the time of reckoning has finally come. A huge swath of the internet will no longer be accessible without enabling JavaScript.
As a web developer who’s worked in the industry for 16 years, every snowflake requiring me to work harder to support their “choices” is just an annoyance. I get wanting to reduce tracking etc., but in all honesty, the 0.0X% of users running tons of blockers with JS off are in reality easier to track than those hiding in the mass of regular users who might be running an ad blocker (or nothing).
As long as your browser is making requests, you’ll never be invisible.
The change needs to come at the regulation level, imho.
Couldn’t agree more.
It’s great that you can do it and you’re free to, but supporting users without JavaScript often means revamping the whole codebase and making everything 5x more complicated.
Which just won’t happen to make 6 users happy
Amen. We do provide text versions, but a few JS-blocking users have complained about having a barebones experience.
> but a few JS-blocking users have complained about having a barebones experience.
Well no shit, have they ever wondered why the language was created in the first place?
It’s goddamn funny, though.
Ya, I feel like disabling JavaScript should not be “beyond standard practice”.
Mull with RethinkDNS on mobile: ‘cool, so the internet just became less accessible’.
I just tested my favourite Cloudflare-blocked site and it still hangs on “verifying the security of your connection” in my fingerprinting-resistant browser profile.
Yeah, I get infinite loops on half the Internet. It’s infuriating and should be illegal for them to deny me as a customer just because they can’t track me.
How does any of this fit into the reality that you can pay $1 per 1000 captchas for a real, actual human to solve them? It seems like so much effort is put into this cat-and-mouse narrative with bot makers, ignoring the reality that sometimes labour is actually much cheaper.
It’s about creating at least a small barrier for not-very-motivated people.
If a script kiddie wants to create a couple accounts and spam a bit, paying for and integrating such a service might just discourage them from actually taking the time.
Just a small cost if you’re dedicated though, for sure
Given that it gets rid of captchas, it neatly evades that issue.
Their goal wasn’t to improve bot blocking, though, but to deter real people less and bots just as much, and it seems they’ve achieved that.
Bots definitely can check a box, and they can even mimic the erratic path of human mouse movement
Damn I didn’t know that was being tracked too
Have you ever clicked a captcha and it’s just checked itself off for you?
That’s because your page-use behaviour looked human enough that it wasn’t worth the robot test.
It has happened on rare occasions. Most of the time, no. But I didn’t think they had access to the mouse cursor trajectory.
Yes, your browser exposes all of this to the page: movement, hover, clicks, etc. It’s how pages are able to respond to various mouse gestures.
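Roughly what that looks like in practice, as a minimal sketch (my own illustration with made-up names like `looksScripted`, not anyone’s actual detection code): the page subscribes to standard pointer events, reconstructs the full cursor trajectory, and can then score how human it looks.

```typescript
// Minimal sketch (illustration only): record the cursor trajectory from
// standard DOM events, then apply a crude "humanness" heuristic.
type PointSample = { x: number; y: number; t: number };

const trail: PointSample[] = [];

document.addEventListener("mousemove", (e: MouseEvent) => {
  // Every pointer movement over the page fires this event, so the
  // full path toward the checkbox is observable by page scripts.
  trail.push({ x: e.clientX, y: e.clientY, t: performance.now() });
});

// Heuristic: human paths wobble and overshoot; naive bots either
// "teleport" to the target or move in a perfectly straight line.
function looksScripted(samples: PointSample[]): boolean {
  if (samples.length < 3) return true; // cursor appeared out of nowhere
  let pathLen = 0;
  for (let i = 1; i < samples.length; i++) {
    pathLen += Math.hypot(
      samples[i].x - samples[i - 1].x,
      samples[i].y - samples[i - 1].y
    );
  }
  const a = samples[0];
  const b = samples[samples.length - 1];
  const direct = Math.hypot(b.x - a.x, b.y - a.y);
  // A path-to-straight-line ratio of ~1.0 means a synthetic beeline.
  return direct > 0 && pathLen / direct < 1.02;
}
```

A straightness ratio alone is trivially defeated, of course; real systems presumably combine many signals like this.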
Cloudflare MITMs a good portion of internet traffic: as the reverse proxy for the sites behind it, it terminates TLS, so it can see inside the SSL tunnels of most websites you visit. It’s an absolute privacy nightmare.
But how does it work? I don’t see any explanation in the post or on the CF website. It looks magical.
For Turnstile, the actual act of checking a box isn’t important, it’s the background data we’re analyzing while the box is checked that matters. We find and stop bots by running a series of in-browser tests, checking browser characteristics, native browser APIs, and asking the browser to pass lightweight tests (ex: proof-of-work tests, proof-of-space tests) to prove that it’s an actual browser.
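The proof-of-work part, at least, is a well-understood idea, even though Turnstile’s actual tests aren’t public. A generic sketch of the concept (the function name and difficulty scheme are my assumptions, not Cloudflare’s challenge): the server sends a random string, and the browser grinds out a nonce whose hash meets a difficulty target.

```typescript
// Generic proof-of-work sketch (illustration of the concept, not
// Turnstile's actual challenge). The server sends a random challenge;
// the browser searches for a nonce whose SHA-256 digest starts with
// `difficulty` zero bytes. One visitor pays milliseconds of CPU time;
// a bot farm pays that cost millions of times over.
async function solveProofOfWork(
  challenge: string,
  difficulty: number // leading zero bytes required (assumed parameter)
): Promise<number> {
  const encoder = new TextEncoder();
  for (let nonce = 0; ; nonce++) {
    const digest = new Uint8Array(
      await crypto.subtle.digest("SHA-256", encoder.encode(challenge + nonce))
    );
    if (digest.slice(0, difficulty).every((b) => b === 0)) {
      return nonce; // sent back to the server for verification
    }
  }
}
```

The asymmetry is what makes it useful: finding the nonce takes many hash attempts, but the server verifies it with a single hash.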
But… lots of bots are made with RPAs, with actual browsers and interfaces emulating human interaction. Sounds like a response to https://proton.me/blog/proton-captcha
Cloudflare is rather late with this, to be honest; Google has had interaction-free reCAPTCHA for ages.
User simulation is something these automated tools are designed to detect. It’s also why attempts to remove identification mechanisms are treated with more suspicion.
For example, very few bots actually have real, usable GPUs, relying on software rendering instead. This can be detected and weighed alongside a bunch of other signals. Running Selenium on your desktop will make you hard to detect, but running it in the cloud (even when proxied through a botnet like the big scrapers) will make bots quite obvious.
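For the curious, that GPU signal is easy to probe from a page. A sketch (my illustration; the function name and regex are made up, not any vendor’s real fingerprinting code): the WebGL renderer string tends to give away the software rasterizers common in cloud automation.

```typescript
// Sketch of the GPU signal described above (illustration only).
// The WebGL renderer string exposes the software rasterizers that
// headless/cloud environments tend to use.
function rendererLooksSoftware(): boolean {
  const gl = document.createElement("canvas").getContext("webgl");
  if (!gl) return true; // no WebGL at all is itself a signal
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  const renderer = String(
    ext
      ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
      : gl.getParameter(gl.RENDERER)
  );
  // SwiftShader (headless Chrome) and llvmpipe (Linux VMs) are common
  // software renderers in cloud-hosted automation; one signal of many.
  return /swiftshader|llvmpipe|software/i.test(renderer);
}
```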
I think CAPTCHA is fighting a losing battle, and I think in the future remote attestation technology will determine whether you have access to certain websites or not. This technology has already been built into Safari and it’s on its way to becoming an internet standard, so I kind of expect CAPTCHAs to disappear in a couple of years.
Turnstile was announced over a year ago.
I don’t know shit about it.
Thank you. I didn’t see this part. I guess it’s kind of like their Privacy Pass stuff.