If you’re faced with the tradeoff between security and another priority, your answer is clear: Do security. In some cases, this will mean prioritizing security above other things we do, such as releasing new features or providing ongoing support for legacy systems.
I respect this. I’d be very happy if my boss told me this and I would feel empowered to build great products. I hope this sentiment spreads through the industry.
Eh, my boss formally tells me this too, but then the finances never allow for security anyway. It’s easy to say something like this to journalists and then never follow through in practice.
I’d be curious to see whether this is actually enforced, and for how long. I see companies cutting costs on security all the time. You can’t really trust them with anything other than creating and optimizing processes to make money. I’d rather see public regulators eat into their turnover until they comply.
Sounds like they’ve been following this well… Except replace “security” with AI.
It is. Currently I don’t have a machine where I can both install and test code.
NCIS found the best solution to security years ago.
Incentives like this are tricky. You can reduce the numbers by fixing the problem, or by sweeping it all under the rug. Guess which is easier to do on a quarterly basis?
This is a tough bar. Security often cannot be prioritized alone. You have to have solid architecture and fix bugs because any bug can have potential security impacts. Your code has to be not garbage.
Which is exactly why security should be on the executive agenda.
Tough but necessary. Irrefutably necessary.
Technology has evolved faster than we’ve been able to secure it, and now we’re paying the price with enterprise- and state-level breaches and global annual internet fraud at an all-time high.
And not just software but physical goods too. We’ve produced without any consideration for end-of-product-lifecycle management and now we’re in a plastic crisis.
Completely different spheres of society but so similar in so many ways.
Judging by the last month of our Microsoft 365 tenant at work, they have plenty of room to improve. (Maybe by expanding in-house QA instead of relying on their customers.)
One of the several issues we ran into in the last few weeks was that you couldn’t download or view attachments in the Outlook Web app if you’d been logged in for over 10-ish minutes. According to the official advisory, this was due to “code put in production designed to increase reliability.” That was a funny way of making things reliable. It was over a week before they pushed a fix for that one - right around the time more Outlook issues started popping up.
So yeah, while I agree with you that this might be tough - it might just be the best move they’ve made in a while. Maybe it’ll cause them to pay more attention to fixing bugs, and focus less on solving problems no one has. (Apparently we, as customers, have been dying for an AI button on our keyboard, to easily access an AI feature now baked into the taskbar.)
And in Microsoft’s case you also have to preserve backwards compatibility. It’s one of the reasons the OS continues to dominate despite how it treats its users.
…which often stands at odds with actual security.
They tend to make breaking changes every other release, which is always the release that people hate. (Granted, I don’t know wtf they’ve done with usability in Windows 11, but at least I can’t move the taskbar anymore.)
Well, that’s a breaking change for usability. I’m talking e.g. not allowing any random process to access the clipboard.
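To make that concrete, here’s a minimal sketch of the clipboard point - assuming Windows and the Win32 clipboard API called through ctypes - showing that any unprivileged process can read whatever text is on the clipboard, with no permission prompt involved:

```python
# Minimal sketch (assumes Windows): any unprivileged process can read the
# clipboard's text via the Win32 API -- there is no permission prompt.
import ctypes

CF_UNICODETEXT = 13  # Win32 clipboard format for UTF-16 text

user32 = ctypes.windll.user32
kernel32 = ctypes.windll.kernel32

# Declare pointer-sized return/argument types to avoid truncation on 64-bit.
user32.GetClipboardData.restype = ctypes.c_void_p
kernel32.GlobalLock.restype = ctypes.c_void_p
kernel32.GlobalLock.argtypes = [ctypes.c_void_p]
kernel32.GlobalUnlock.argtypes = [ctypes.c_void_p]

def read_clipboard_text():
    """Return the clipboard's text contents, or None if there is none."""
    if not user32.OpenClipboard(None):
        return None
    try:
        handle = user32.GetClipboardData(CF_UNICODETEXT)
        if not handle:
            return None
        ptr = kernel32.GlobalLock(handle)
        try:
            # Interpret the locked memory as a NUL-terminated wide string.
            return ctypes.c_wchar_p(ptr).value
        finally:
            kernel32.GlobalUnlock(handle)
    finally:
        user32.CloseClipboard()

if __name__ == "__main__":
    print(read_clipboard_text())
```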
Well, going from ‘hot garbage’ to ‘not garbage’, they have a long road ahead.
“Not garbage” seems like a low bar to overcome for a company with such long experience. 😅
Long experience of producing garbage code…
What makes it garbage code? I mean, I don’t like Windows due to the user experience, but I have zero insight into the code itself because it’s proprietary closed-source and I’ve never worked at Microsoft.
I mean, there is actually leaked source code of Windows XP out there, because, you guessed it, they had a leak of that, too.
But I actually said “garbage code”, because I didn’t want to say that everything they’ve ever done is purely garbage. I didn’t want to claim that I have particular insight into specifically their code.
I have to assume, though, that their code quality is garbage, because:

- Lots of MS software is buggy. In particular, all those security issues are bugs, too.
- They keep backwards compatibility to just absurd degrees. To this day, you can’t create a file called “aux”, for example, because at some point they had to block that name to retrofit filesystem support into their OS (a quick sketch of those reserved names follows below). At the very least, this is going to mean they’ll have tons of such workarounds and gotchas, which will make it difficult for new devs, but also offer more surface area for bugs/vulnerabilities.
- Well, and then there are some urban legends. For example, I’ve heard that the entirety of Windows is in one giant monorepo. I just quickly peeked into a supposed copy of the Windows XP leak and that did look the part…
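As a quick illustration of that “aux” quirk, here’s a minimal sketch - the device names come from Microsoft’s documented list of reserved names, while the check function itself is just for illustration. The reservation is case-insensitive and ignores extensions, which is why even “aux.txt” collides:

```python
# Minimal sketch: the DOS-era device names Windows still reserves, per
# Microsoft's "Naming Files, Paths, and Namespaces" documentation.
# A bare "aux" (or "aux.txt") can't be created as a normal file.
RESERVED = {"CON", "PRN", "AUX", "NUL",
            *{f"COM{i}" for i in range(1, 10)},
            *{f"LPT{i}" for i in range(1, 10)}}

def is_reserved_name(filename: str) -> bool:
    # Case-insensitive, and the extension is ignored, so "Aux", "aux.txt"
    # and "NUL.log" all collide with a device name.
    stem = filename.split(".", 1)[0]
    return stem.upper() in RESERVED

for name in ("aux", "aux.txt", "report.txt"):
    print(name, is_reserved_name(name))
```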
All software is buggy 😅
But yeah, keeping backwards compatibility does tend to open a lot of bug surfaces, like you say. Though IMO that’s due to the decision to do so, rather than the code itself. I’m sure they do their best with the corporate decisions to which they have to adhere. But you probably didn’t mean they are bad coders, merely that the end product becomes buggy, I suppose. 😊
Yet here we are… 🙄
But we just bought tool X that is ISO certified AND SOC 2 compliant. How are we not secure yet? Does the tool not work?!?
So they are changing teams’ KPIs to allow for this, right? If I were an employee I’d also fear that it’s going to become impossible to do anything, because people won’t have access to the systems they need to do their job.
That kind of irrational fear of implementing good security is a big part of how bad security happens, which leads to breaches.
Doing your work securely should be the norm. Each person should have the least privilege they need to do their job.
The problem is that if you implement security that’s too strict, employees will find workarounds that are even worse than the more permissive setup. I don’t disagree that people should have the minimum access required to do their job, but if something isn’t proprietary the controls should be relaxed, and when someone requests access it needs to be handled immediately so they aren’t delayed in whatever they were trying to do.
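A minimal sketch of the least-privilege / fast-grant idea both comments describe - the role names, scopes, and request queue below are invented purely for illustration: access is denied by default, each role only carries the scopes it actually needs, and anything else goes through an explicit request path whose turnaround is what keeps people from working around the controls.

```python
# Minimal sketch of least privilege: deny by default, grant per role,
# route everything else through an explicit (and ideally fast) request.
# Role names and scopes are invented for illustration.
from dataclasses import dataclass

ROLE_SCOPES = {
    "support-agent": {"tickets:read", "tickets:write"},
    "developer":     {"repo:read", "repo:write", "ci:run"},
    "finance":       {"invoices:read", "invoices:write"},
}

@dataclass
class AccessRequest:
    user: str
    role: str
    scope: str
    reason: str

pending_requests: list[AccessRequest] = []

def is_allowed(role: str, scope: str) -> bool:
    # Deny by default: only scopes explicitly granted to the role pass.
    return scope in ROLE_SCOPES.get(role, set())

def request_access(user: str, role: str, scope: str, reason: str) -> None:
    # Out-of-scope requests land in a review queue instead of silently
    # failing; a quick turnaround here is what stops people from finding
    # worse workarounds.
    pending_requests.append(AccessRequest(user, role, scope, reason))

print(is_allowed("support-agent", "tickets:read"))   # True
print(is_allowed("support-agent", "invoices:read"))  # False
request_access("alice", "support-agent", "invoices:read", "quarterly report")
```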
Seems best to do this after firing the first two or three levels of leadership, since this whole mess was created on their watch. Maybe the next thing to do is to ask whether the US government wants to depend so heavily on a company that is no longer a US entity.
Microsoft is overwhelmingly Indian contractors now. In fact, much of the large legacy US tech industry has done so much offshoring that I’d hardly call these companies US companies anymore. Are they really who we want to stake our national security on?
I wonder if this will actually cause an increase in the number of security vulnerabilities and breaches as there’s now a fairly obvious way for employees to penalize their bosses financially for being assholes…
They long ago fired the testers who might’ve caught that. So yeah, I can totally see that happening.
That’s exactly it. M$ execs look at this stat and probably go “we need to make it more insecure, for the shareholders - of course.”
Security and Stability please.