Hiya mates! I've been doing some proper digging on Akamai and their bloody bot detection, and I've stumbled across some really interesting stuff that I wanna share with all of ya. Some of you might already know about it, while some of you might not have a clue. So, sit tight and let me enlighten ya!
Alright mates, let me break it down for ya.
Akamai's basically a big CDN and security outfit, and their Bot Manager product is the tech that susses out whether website traffic is coming from a real person or a bot. It does this with some old-school methods like TLS fingerprinting and connection analysis, as well as newer machine-learning models.
So, let's say you're tryin' to access an API that's protected by Akamai. When you first connect to the site, the request gets analysed on the "Akamai Intelligent Edge Platform", and that's what feeds Bot Manager its picture of traffic patterns, types, volumes and so on.
Now, you might be wonderin' how to get around all that. Well, one way is to use an automated browser like Selenium. Sure, Akamai can detect a vanilla Selenium setup, but there are patched drivers out there, like the undetected-chromedriver project, that can get the job done.
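Just as a taster, here's a minimal sketch using undetected-chromedriver (pip install undetected-chromedriver). The URL is a placeholder, not a real protected endpoint, and you'd still want proxies and proper headers on top of this:

```python
# Minimal sketch with the undetected-chromedriver package
# (pip install undetected-chromedriver). The URL is a placeholder,
# not a real Akamai-protected endpoint.
import undetected_chromedriver as uc

options = uc.ChromeOptions()
options.add_argument("--window-size=1920,1080")  # look like a normal desktop session

driver = uc.Chrome(options=options)
try:
    driver.get("https://example.com/protected-page")  # placeholder target
    print(driver.title)
finally:
    driver.quit()
```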
But there's more to it than just that. Take TLS fingerprinting. TLS (formerly SSL) is what encrypts HTTPS connections, and every connection starts with a ClientHello where the client lists its supported cipher suites, extensions and so on, a bit like agreeing on a secret code with the server. JA3 fingerprinting hashes those values, so if your automation tool's ClientHello looks different from a regular web browser's, you get spotted. To get around this, you need to make sure the libraries and tools you use for HTTP connections are JA3 resistant, meaning they can send a browser-matching ClientHello.
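One way to do that is the curl_cffi library (pip install curl_cffi), which wraps a patched curl that can impersonate a real browser's ClientHello. Rough sketch below, with a placeholder URL; check the curl_cffi docs for the current list of impersonation targets:

```python
# Sketch with curl_cffi (pip install curl_cffi): the impersonate flag
# makes the TLS ClientHello (and thus the JA3 hash) match real Chrome.
from curl_cffi import requests

resp = requests.get(
    "https://example.com/protected-page",  # placeholder target
    impersonate="chrome110",  # one of the built-in browser targets
)
print(resp.status_code)
```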
Another thing Akamai looks at is IP address fingerprinting. They check the address against ASN and reputation databases to tell whether it's residential, mobile, or datacenter. But I reckon most of ya on this forum already know the fix for that one: route your traffic through residential or mobile proxies.
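For completeness, here's what that looks like with httpx. The proxy URL and credentials are placeholders for whatever your provider hands you (and note that older httpx versions spell the argument proxies instead of proxy):

```python
# Sketch: pushing traffic through a residential proxy with httpx.
# The proxy URL/credentials are placeholders from a hypothetical provider.
import httpx

proxy = "http://user:pass@residential.proxy.example:8000"  # placeholder
with httpx.Client(proxy=proxy) as client:  # older httpx: proxies=proxy
    resp = client.get("https://example.com/protected-page")
    print(resp.status_code)
```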
And then there's HTTP details. Akamai uses the complexity of the HTTP protocol itself to detect bots. Most real browsers speak HTTP/2 (and increasingly HTTP/3), while lots of automated web software still defaults to HTTP/1.1, which makes it dead easy to spot. Even when newer clients like curl and httpx support HTTP/2, Akamai can still pick them out with HTTP/2 fingerprinting (things like SETTINGS frame values and pseudo-header order). And don't forget HTTP request headers. Akamai looks for headers that real web browsers send but automated software doesn't, so your request headers and their order need to match a real browser and the context of the website. The sneakiest ones, like User-Agent, Accept-Language and the Sec-CH-UA client hints, are prime giveaways.
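Here's a sketch of the header side with httpx over HTTP/2 (pip install "httpx[http2]"). The header values below just approximate a recent desktop Chrome; in practice you'd lift the real set and order from your own browser's devtools for the target site. Fair warning: plain httpx won't let you control the low-level HTTP/2 fingerprint itself, so a tool like curl_cffi has to cover that part too:

```python
# Sketch: HTTP/2 request with Chrome-like headers in a browser-like order.
# Values below approximate a recent desktop Chrome; copy the real set
# from your browser's devtools for the site you're targeting.
import httpx

headers = {
    "user-agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) "
        "Chrome/120.0.0.0 Safari/537.36"
    ),
    "accept": (
        "text/html,application/xhtml+xml,application/xml;q=0.9,"
        "image/avif,image/webp,*/*;q=0.8"
    ),
    "accept-language": "en-US,en;q=0.9",
}

with httpx.Client(http2=True, headers=headers) as client:
    resp = client.get("https://example.com/protected-page")
    print(resp.http_version, resp.status_code)  # expect "HTTP/2"
```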
There's actually a lot more to cover, but I'll leave it there for now. Maybe I'll do a part 2 if you show me some love.
Cheers mates!