What kind of shit question is that???
You should be questioning the game developers about whether they’d implement server-side solutions instead of installing rootkits on users’ PCs and dictating what settings they should use.
Fuck off, Eurogamer. No game should require any sort of kernel-level access or settings change on your PC.
How can you implement server-side anti-cheat?
Machine learning. “Oh, this player pulled off an impossible move more than once; maybe we should flag that.”
Valve have been doing it for more than a decade. Now imagine what others could do. They’re so caught up in “AI”, but won’t try to use it for anything it could actually be useful for.
How do you tell the difference between someone with a good aimbot (that simulates real input) and someone who’s just really good?
You can’t (server side).
Very easily, that’s what machine learning is for.
You can’t tell with client-side either, so that’s a moot argument. Anti-cheat is always bypassed; most good cheats don’t even run on the same device anymore, completely circumventing any kernel anti-cheat anyway.
On the server, they have all the data about where a player could be, what they could see, what they could hear, what human mouse movement looks like, etc., all of which can be used to target cheaters in a way they can’t get around. Player reporting would still exist, of course, for any other edge cases.
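A minimal sketch of what that kind of server-side check could look like. Everything here is invented for illustration — the tick format, the threshold, the numbers; a real system would feed features like this into a model rather than hard-coding a cutoff:

```python
import math

def snap_score(samples, max_human_deg_per_s=2000.0):
    """Count shots preceded by an implausibly fast aim snap.

    Each sample is (time_s, yaw_deg, pitch_deg, fired, headshot),
    data the server already receives every tick anyway.
    """
    suspicious = 0
    for prev, cur in zip(samples, samples[1:]):
        t0, yaw0, pitch0, _, _ = prev
        t1, yaw1, pitch1, fired, headshot = cur
        dt = t1 - t0
        if dt <= 0 or not (fired and headshot):
            continue
        dyaw = abs(yaw1 - yaw0) % 360
        dyaw = min(dyaw, 360 - dyaw)          # handle wrap-around
        angle = math.hypot(dyaw, pitch1 - pitch0)
        if angle / dt > max_human_deg_per_s:  # superhuman flick
            suspicious += 1
    return suspicious

# A 170-degree flick to a headshot inside one 16 ms tick looks inhuman:
ticks = [
    (0.000, 10.0, 0.0, False, False),
    (0.016, 180.0, 0.0, True, True),
]
print(snap_score(ticks))  # 1
```

The point is only that the server can compute signals like this from data it already has, without touching the client.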
Client side anti-cheat has more data than server-side, because that is where the player’s actual screen, mouse and keyboard are.
The cheat only uses data available on the client - obviously - so the extra data about game state on the server is irrelevant.
“ML” is also not relevant. It doesn’t make the server any more able to make up for the data it doesn’t have. It only forces cheats to try and make realistic inputs, which they already do. And it ends up meaning that you don’t understand the decisions your anti-cheat model is making, so the inevitable false positives will cause a stink because you can’t justify them.
It doesn’t have to extinguish 99% of cheaters; hell, it doesn’t even need to extinguish cheating altogether. It just has to make the problem manageable and invisible to players. That’s something server-side can achieve. I’ll take the odd game with a cheater in it if my entire PC isn’t held ransom by some random company.
If cheaters exist but can only cheat in a way that makes them look like a real player, then it doesn’t really affect the game anymore and the problem isn’t visible to players. You are never going to get rid of cheaters; even at LAN events they have injected software in the past. It’s a deeper problem than we can solve with software.
Client-side AC has proven futile over and over again, even today with all the kernel AC. As I already said: most good cheats don’t even run on the same device anymore, completely circumventing any kernel (client side) anti-cheat anyway.
Why be allergic to trying something new? Something that isn’t invasive, a massive security threat or controlling of your own personal system.
It doesn’t have to extinguish 99% of cheaters, but if it affects 1% of legitimate players that’s a big problem. Good luck tuning your ML to have a less than 1% false positive rate while still doing anything.
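To put numbers on why that 1% matters (the population and rates below are made up; only the arithmetic is the point):

```python
players = 1_000_000           # hypothetical player base
cheater_rate = 0.01           # assume 1% of players cheat
false_positive_rate = 0.01    # model wrongly flags 1% of legit players
true_positive_rate = 0.90     # model catches 90% of cheaters

cheaters = players * cheater_rate            # 10,000
legit = players - cheaters                   # 990,000
false_flags = legit * false_positive_rate    # 9,900 innocent players
true_flags = cheaters * true_positive_rate   # 9,000 cheaters

# Over half of everyone flagged would be innocent:
print(false_flags / (false_flags + true_flags))  # ~0.524
```

This is the classic base-rate problem: when cheaters are rare, even a small false positive rate swamps the true detections, which is reportedly why VACnet-style systems act only on high-confidence cases and route the rest to human review.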
It already exists: VACnet, in the largest competitive FPS, Counter-Strike. And machine learning has grown massively in the last couple of years, as you probably know from all the “AI” buzz.
I mean, Valve could explicitly say that they have some trusted hardware and software stack or something and let games know whether the environment’s been modified.
That’d require support from Valve, and it’s about the only way you could get both: a locked-down mode for multiplayer games where addressing cheating is a problem (and where I think the “closed console system” model is probably more appropriate, with the “open PC model” at best kludged into kinda-sorta working like a console), and an “open mode” the system can still run in.
My own approach is just to not play most multiplayer competitive games on PCs. I’ve enjoyed them in the past, but for anything seriously reflex-oriented like FPSes, your reflexes go downhill with age anyway. And they come with all kinds of issues, even on a locked-down system that successfully avoids cheating: people griefing; not generally being able to pause the game to use the toilet, deal with a screaming kid, or answer the door; and the other players, unlike game AIs, aren’t necessarily going to be optimized to play a “fun” game for me. With game AIs, you don’t need an Internet connection, and being in a remote area isn’t a limiting factor.
I think that the future is gonna be shifting towards better game AIs. Hard technical problems to solve there, but it’s a ratchet — we only get better over time.
The burden should be on the developers and a server-side solution. No PC should be invaded with software to stop cheating. It’s cat and mouse anyway with client-side detection; by chasing it so hard, they are just incentivizing the creation of less and less detectable cheats.
The whole “it’s an untampered system” thing doesn’t work. It’s like Secure Boot now randomly being required by games. No user should have to enable or disable anything like that just to run a game. It’s their device; they should have the freedom to do what they want and still run an application.
I think the invasion of bots into games is ruining them, personally; no matter how old I get, or how bad I get at them, I still want to play against real players. I wouldn’t mind a mode with just AI for people who want that, but they should never be mixed in with real players.
There are some fundamental limitations on what you can do with purely server-side solutions. If you’re playing online card games, sure, you can do viable pure server-side stuff to resist most cheating. That’ll get everything short of using, say, a calculator to compute probabilities or count cards or something.
But with, say, FPSes, that’s not really practical. You need to have some information on the client that the player shouldn’t be privy to, to mitigate things like latency. For example, if another player runs around the edge of a wall and becomes visible, your client needs to know that they’re behind the wall and rounding the corner in order to promptly show the opposing character becoming visible. And that means trusting client-side code not to be running a wallhack. Doing that reliably entails trusted hardware, and that can’t be done by the game developer alone; it’s gotta have support from Valve if you want that.
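The tension can be sketched concretely. If the server only reveals an enemy’s position once it’s strictly visible, network latency makes the enemy pop in late, so the server has to lead the reveal, and that lead is exactly what a wallhack reads. All the names and numbers below are invented for illustration:

```python
def should_send(enemy_dist_from_corner, enemy_speed, rtt_s, visible):
    """Decide whether the server reveals an enemy's position to a client.

    To hide the 'pop' caused by network round-trip time, the server
    must send the position slightly *before* the enemy becomes
    visible, and that early data is what a wallhack can read.
    """
    if visible:
        return True
    # How far the enemy can move during one round trip:
    lead_distance = enemy_speed * rtt_s
    return enemy_dist_from_corner <= lead_distance

# Enemy 0.4 m from the corner, moving 6 m/s, 100 ms RTT:
print(should_send(0.4, 6.0, 0.100, visible=False))  # True: must pre-send
# Enemy 5 m away behind the wall: safe to withhold.
print(should_send(5.0, 6.0, 0.100, visible=False))  # False
```

Some games do exactly this kind of server-side occlusion culling (Valorant’s fog-of-war system, for instance) to shrink the window a wallhack can exploit, but they can’t close it entirely without making corner fights feel laggy.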
It’s practical; VACnet has existed for over a decade. It might not be perfect, but it’s a start, and any company serious about anti-cheat could take that premise further.
The downside is that cheaters have to play at least one game before they are detected. Client-side stuff is better for initial prevention, but even that’s becoming trivial, as most good cheats don’t even run on the same computer as the game anymore, circumventing all AC software anyway. If your game costs money to play, that’s already one of the biggest hurdles, so prevention isn’t worth chasing at the expense of users’ privacy and security.
Any downsides from server-side are nothing in comparison to the downsides of client side anti-cheat.