
(Ars Technica)   Intel closing up ongoing security exploits in its upcoming Tiger Lake line. Just kidding, they're adding another layer of attack by baking in anti-malware detection instead   (arstechnica.com)
    More: Asinine, Operating system, system security, X86, software exploits, Intel's SGX, veteran Windows security expert, implementation of CET, distinct difference  

481 clicks; posted to Geek » on 15 Jun 2020 at 6:59 PM (15 weeks ago)



28 Comments
 
 
2020-06-15 6:14:29 PM  
We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.
 
2020-06-15 7:08:11 PM  
[Fark user image]


Here's your solution.
 
2020-06-15 7:09:46 PM  

Abe Vigoda's Ghost: [Fark user image image 400x142]

Here's your solution.


Yeah, AMD chips haven't had any major security exploits.
 
2020-06-15 7:31:31 PM  
I love reading about vulnerabilities that should never have existed in the first place.

Back in the late '80s, I was talking to a mainframe system programmer about how easy it would be to bypass a bunch of RACF security that was baked into files, if you knew the file contents.

He smiled and said it wouldn't work; he'd put in some code to verify access before any call was executed.
 
2020-06-15 7:32:07 PM  

Russ1642: Abe Vigoda's Ghost: [Fark user image image 400x142]

Here's your solution.

Yeah, AMD chips haven't had any major security exploits.


Yet.  They haven't had any major security exploits yet.

When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.  Now that their market share is ramping up (particularly in the data center) we'll see if that holds.
 
2020-06-15 8:01:40 PM  
Nothing is infallible, but at least this layer of protection does not seem to introduce new vulnerabilities or new methods of access that did not previously exist. So the worst case scenario is that it simply doesn't work, as opposed to accidentally making you more vulnerable.
 
2020-06-15 8:31:02 PM  
Relying on the CPU architecture for security was probably a mistake anyway, but here we are. I'd like to see operating systems that claim a high level of security - like OpenBSD - assume that the CPU architecture's security doesn't exist, so that if the CPU security fails, the kernel will still catch it.
 
2020-06-15 8:36:57 PM  

koder: We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.


Where do you live where developers and electrical engineers aren't paid enough to do a good job?
 
2020-06-15 8:54:27 PM  

OptionC: Russ1642: Abe Vigoda's Ghost: [Fark user image image 400x142]

Here's your solution.

Yeah, AMD chips haven't had any major security exploits.

Yet.  They haven't had any major security exploits yet.

When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.  Now that their market share is ramping up (particularly in the data center) we'll see if that holds.


No, they don't hold enough sway over the industry to take shortcuts and make assumptions about how the code will look and respond, so they write the silicon to cover all possibilities. Intel can afford to have their processors make assumptions, and this allows some weird timing attacks to work (Meltdown was far worse).

Supposedly there is an attack that works on just about everything AMD ever made (including Bulldozer). I was convinced it was a joke, since it allowed one thread to tell what the other thread was doing based on cache access (and Bulldozer has separate L1 caches, so that *shouldn't* work). But AMD also uses fewer bits in their MMU than you would assume they need (and this only bites them 0.01% of the time). Presumably, by carefully counting when they are bitten, you can tell what is going on in the other cache...
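For anyone curious how that kind of cache side channel works in principle, here's a toy Python model (everything here is made up for illustration; a real attack times actual memory accesses rather than inspecting a simulated cache):

```python
# Toy model of a cache side channel: the "victim" touches one
# secret-dependent cache line; the "attacker" then probes every line
# and infers the secret from which probe is a hit. Real attacks
# distinguish hits from misses by access latency.

class ToyCache:
    def __init__(self):
        self.lines = set()          # which line indexes are "cached"

    def access(self, line):
        hit = line in self.lines    # fast if cached, slow if not
        self.lines.add(line)
        return hit

def victim(cache, secret):
    cache.access(secret)            # access pattern depends on the secret

def attacker(cache, num_lines):
    # Prime+probe: cache started empty, so the only hit is the line
    # the victim touched.
    return [line for line in range(num_lines) if cache.access(line)]

cache = ToyCache()
victim(cache, secret=7)
print(attacker(cache, 16))          # recovers [7]
```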
 
2020-06-15 9:01:46 PM  

SMB2811: koder: We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.

Where do you live where developers and electrical engineers aren't paid enough to do a good job?


Apparently everywhere, considering the processors we're shipping everywhere have needed three revisions to fix basic buffer safety, which itself is hilarious because it's expecting people that can't figure out higher-level concepts like memory management or decent QA to appropriately plan for lower-level processor instructions.
 
2020-06-15 9:06:03 PM  
Got the list of malware from Kaspersky?
 
2020-06-15 9:16:45 PM  

koder: SMB2811: koder: We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.

Where do you live where developers and electrical engineers aren't paid enough to do a good job?

Apparently everywhere, considering the processors we're shipping everywhere have needed three revisions to fix basic buffer safety, which itself is hilarious because it's expecting people that can't figure out higher-level concepts like memory management or decent QA to appropriately plan for lower-level processor instructions.


The problem isn't pay, they get paid plenty. These problems keep happening because no one cares. Products ship and sell with or without problems. They ship on time with, late without. They sell with new features, sit on the shelf if it's just safer.
 
2020-06-15 9:30:43 PM  

koder: We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.


Do you really believe engineers are themselves making the final decisions on anything?  Or the less qualified guy the engineers are doing their best to explain the options to?
 
2020-06-15 9:35:50 PM  

OptionC: Russ1642: Abe Vigoda's Ghost: [Fark user image image 400x142]

Here's your solution.

Yeah, AMD chips haven't had any major security exploits.

Yet.  They haven't had any major security exploits yet.

When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.  Now that their market share is ramping up (particularly in the data center) we'll see if that holds.


Talk about missing the joke. Yes, they've had major security exploits along with Intel.
 
2020-06-15 9:43:08 PM  

Crosma: Relying on the CPU architecture for security was probably a mistake anyway, but here we are. I'd like to see operating systems that claim a high level of security - like OpenBSD - assume that the CPU architecture's security doesn't exist, so that if the CPU security fails, the kernel will still catch it.


You need at least some hardware assistance if you want your programs to complete any time this side of eternity.  Otherwise, you're stuck inspecting each instruction before you run it, essentially operating a very slow virtual machine.

The big security mistake in modern architecture seems to be mixing data and code addresses in a single stack.  The fix outlined here appears to address that issue in an almost comically convoluted fashion, but one necessary in order to maintain code compatibility.  Of course the article is just an overview so I might be getting some of the details wrong.  I've long felt that we needed an architecture adjustment if we wanted to escape the perils of the stack, but nobody's been brave/foolhardy enough to do so.
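To illustrate that point about mixing data and return addresses, here's a toy Python model of one stack frame (purely illustrative, not real machine behavior; the names are invented):

```python
# Toy model of a call stack that mixes data buffers and return
# addresses in one region. Writing past the end of the buffer
# silently clobbers the saved return address -- the root cause
# behind return-oriented attacks.

def make_frame(return_addr, buf_size):
    # [buffer slots ...][saved return address] -- one mixed frame
    return [0] * buf_size + [return_addr]

def write_buffer(frame, data):
    # No bounds check, like C's strcpy: an overrun spills into the
    # slot holding the return address.
    for i, value in enumerate(data):
        frame[i] = value

frame = make_frame(return_addr=0x4010, buf_size=4)
write_buffer(frame, [1, 2, 3, 4])        # fits: return address intact
print(hex(frame[-1]))                    # 0x4010
write_buffer(frame, [1, 2, 3, 4, 0x666]) # one element too many
print(hex(frame[-1]))                    # 0x666 -- control flow hijacked
```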
 
2020-06-15 10:01:12 PM  
I don't think you should be baking computers.
 
2020-06-15 10:13:01 PM  

orneryredguy: Crosma: Relying on the CPU architecture for security was probably a mistake anyway, but here we are. I'd like to see operating systems that claim a high level of security - like OpenBSD - assume that the CPU architecture's security doesn't exist, so that if the CPU security fails, the kernel will still catch it.

You need at least some hardware assistance if you want your programs to complete any time this side of eternity.  Otherwise, you're stuck inspecting each instruction before you run it, essentially operating a very slow virtual machine.

The big security mistake in modern architecture seems to be mixing data and code addresses in a single stack.  The fix outlined here appears to address that issue in an almost comically convoluted fashion, but one necessary in order to maintain code compatibility.  Of course the article is just an overview so I might be getting some of the details wrong.  I've long felt that we needed an architecture adjustment if we wanted to escape the perils of the stack, but nobody's been brave/foolhardy enough to do so.


I think all they're doing is keeping a second stack which only the processor has access to at the hardware level, and copying every address they push on to the actual program stack to this second stack. When they pop the program stack, if it doesn't match what's in the second stack, they know corruption has occurred. It's nothing but hardware-level detection of stack overflows.
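That description of the mechanism can be sketched as a toy Python model (the real shadow stack lives in hardware-protected memory that normal stores can't touch; names here are invented):

```python
# Toy model of a shadow stack: CALL pushes the return address onto
# both stacks; RET pops both, compares them, and traps on a mismatch.

class ControlFlowViolation(Exception):
    pass

class Machine:
    def __init__(self):
        self.stack = []     # normal stack, writable by the program
        self.shadow = []    # shadow stack, CPU-only in the real design

    def call(self, return_addr):
        self.stack.append(return_addr)
        self.shadow.append(return_addr)

    def ret(self):
        addr = self.stack.pop()
        if addr != self.shadow.pop():
            raise ControlFlowViolation("return address tampered")
        return addr

m = Machine()
m.call(0x1000)
print(hex(m.ret()))         # 0x1000: clean return
m.call(0x2000)
m.stack[-1] = 0xbad         # attacker overwrites the return address
try:
    m.ret()
except ControlFlowViolation as e:
    print("trapped:", e)    # the CPU faults instead of jumping to 0xbad
```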
 
2020-06-15 10:32:51 PM  

gyruss: orneryredguy: Crosma: Relying on the CPU architecture for security was probably a mistake anyway, but here we are. I'd like to see operating systems that claim a high level of security - like OpenBSD - assume that the CPU architecture's security doesn't exist, so that if the CPU security fails, the kernel will still catch it.

You need at least some hardware assistance if you want your programs to complete any time this side of eternity.  Otherwise, you're stuck inspecting each instruction before you run it, essentially operating a very slow virtual machine.

The big security mistake in modern architecture seems to be mixing data and code addresses in a single stack.  The fix outlined here appears to address that issue in an almost comically convoluted fashion, but one necessary in order to maintain code compatibility.  Of course the article is just an overview so I might be getting some of the details wrong.  I've long felt that we needed an architecture adjustment if we wanted to escape the perils of the stack, but nobody's been brave/foolhardy enough to do so.

I think all they're doing is keeping a second stack which only the processor has access to at the hardware level, and copying every address they push on to the actual program stack to this second stack. When they pop the program stack, if it doesn't match what's in the second stack, they know corruption has occurred. It's nothing but hardware-level detection of stack overflows.


Yeah, that's what I get.  I can see that breaking some more esoteric DOS-era assembler code, but nothing legit coded in the last 20 years should mind.  But having a stack, and then having a super secret copy of the stack, means you're duplicating data, plus I'm curious where exactly they're keeping it and guaranteeing it's big enough.  RET calls are going to be more expensive, but that shouldn't be a big deal.  It would have been simpler (but broken just about all existing code) to have separate return address and argument stacks, and lock the hell out of the first one.
 
2020-06-15 10:40:27 PM  

koder: We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.


I mean... we *do* pay software engineers like we do doctors.

Not as a group, but it's not your average developer working on these types of things. Look at the compensation level for devs working on projects of this scale and it exceeds general practitioner docs, and they get better hours and working conditions too.
 
2020-06-15 10:40:49 PM  

SMB2811: koder: SMB2811: koder: We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.

Where do you live where developers and electrical engineers aren't paid enough to do a good job?

Apparently everywhere, considering the processors we're shipping everywhere have needed three revisions to fix basic buffer safety, which itself is hilarious because it's expecting people that can't figure out higher-level concepts like memory management or decent QA to appropriately plan for lower-level processor instructions.

The problem isn't pay, they get paid plenty. These problems keep happening because no one cares. Products ship and sell with or without problems. They ship on time with, late without. They sell with new features, sit on the shelf if it's just safer.


This. It's not about pay. It's about time. I don't have time for security, unit tests, integration tests, code reviews, or regular maintenance.

The worst thing I can do, in the company's mind, is to not be working on something new. Unless things break, then it's "Why haven't you fixed this? Oh yeah, and since I have you, where's my new thing?"
 
2020-06-15 11:23:38 PM  

Mr Tarantula: SMB2811: koder: SMB2811: koder: We could make better software and pay engineers like we pay doctors, or we could add more shiat to the processor that will inevitably be circumvented or exploited themselves. Hmmmm.... HMMMMM.

Where do you live where developers and electrical engineers aren't paid enough to do a good job?

Apparently everywhere, considering the processors we're shipping everywhere have needed three revisions to fix basic buffer safety, which itself is hilarious because it's expecting people that can't figure out higher-level concepts like memory management or decent QA to appropriately plan for lower-level processor instructions.

The problem isn't pay, they get paid plenty. These problems keep happening because no one cares. Products ship and sell with or without problems. They ship on time with, late without. They sell with new features, sit on the shelf if it's just safer.

This. It's not about pay. It's about time. I don't have time for security, unit tests, integration tests, code reviews, or regular maintenance.

The worst thing I can do, in the company's mind, is to not be working on something new. Unless things break, then it's "Why haven't you fixed this? Oh yeah, and since I have you, where's my new thing?"


Yup.

Making bug-free software takes a LONG time. So long, in fact, that if you try to do it, ALL of your competitors will have eaten your market by the time you actually ship a product.

The only thing that will change this is if there were some kind of taxpayer-funded "software verification" organization that could slap their sticker on software. Something like the FDA, I suppose.

The problem with that, though, is that we just don't have the technology to actually do it in any kind of timely manner. It would make publishing software as expensive and time-consuming as introducing a new drug. Years of development, years of testing, years of MORE testing by the equivalent of the FDA...

We just aren't ready for that. And, it would *destroy* the software market. Only the richest players would have the funds to get their software to market.

Not that something shouldn't be done about security issues in software. I just don't know what it would be. Software is just too complicated to evaluate these days.
 
2020-06-16 1:06:53 AM  

Russ1642: OptionC: Russ1642: Abe Vigoda's Ghost: [Fark user image image 400x142]

Here's your solution.

Yeah, AMD chips haven't had any major security exploits.

Yet.  They haven't had any major security exploits yet.

When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.  Now that their market share is ramping up (particularly in the data center) we'll see if that holds.

Talk about missing the joke. Yes, they've had major security exploits along with Intel.


The very reason I funny'd it.

Security has always been a cat and mouse game. Probably always will be.
 
2020-06-16 8:12:12 AM  
I miss the days when computers weren't so connected. If you mucked up the machine, you just power-cycled it, and it came back up fine from the ROM OS faster than Windows 10 boots on the latest and greatest hardware today.
 
2020-06-16 9:41:41 AM  

Russ1642: OptionC: Russ1642: Abe Vigoda's Ghost: [Fark user image image 400x142]

Here's your solution.

Yeah, AMD chips haven't had any major security exploits.

Yet.  They haven't had any major security exploits yet.

When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.  Now that their market share is ramping up (particularly in the data center) we'll see if that holds.

Talk about missing the joke. Yes, they've had major security exploits along with Intel.


That's a pretty misleading way to put it. If you look at the exploits from, say, the last 10 years, you'll see a clear pattern: AMD's products were a lot less vulnerable. Some of the exploits were less severe on AMD, and some clearly didn't affect them at all. Some exploits were assumed (by some people) to also affect them, but without any working proof-of-concept code, so they likely weren't actually vulnerable in real life. On the other hand, I cannot recall even one exploit that affected only AMD but not Intel, nor any exploit in common that was more severe on AMD. No matter how you slice it, the vulnerability list for AMD is much shorter than Intel's.

One major difference was in how they implemented speculative execution, which gave Intel a speed advantage at the expense of security. So let's not make any false equivalence comparing AMD with Intel. And now Intel doesn't even have the speed advantage.

OptionC: When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.


From the perspective of a malware author who has a financial incentive to infect as many machines as possible, that may be true. It's far less of a motivation for security researchers, aka the people who have actually been discovering and disclosing these vulnerabilities. Of course, if AMD left any low-hanging fruit, a malware author would still bite.

Let's not forget state actors. They would be interested in vulnerabilities in both high- and low-market-share products, and certainly have the resources to effectively search for both.
 
2020-06-16 10:53:12 AM  

bk3k: Russ1642: OptionC: Russ1642: Abe Vigoda's Ghost: [Fark user image image 400x142]

Here's your solution.

Yeah, AMD chips haven't had any major security exploits.

Yet.  They haven't had any major security exploits yet.

When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.  Now that their market share is ramping up (particularly in the data center) we'll see if that holds.

Talk about missing the joke. Yes, they've had major security exploits along with Intel.

That's a pretty misleading way to put it. If you look at the exploits from, say, the last 10 years, you'll see a clear pattern: AMD's products were a lot less vulnerable. Some of the exploits were less severe on AMD, and some clearly didn't affect them at all. Some exploits were assumed (by some people) to also affect them, but without any working proof-of-concept code, so they likely weren't actually vulnerable in real life. On the other hand, I cannot recall even one exploit that affected only AMD but not Intel, nor any exploit in common that was more severe on AMD. No matter how you slice it, the vulnerability list for AMD is much shorter than Intel's.

One major difference was in how they implemented speculative execution, which gave Intel a speed advantage at the expense of security. So let's not make any false equivalence comparing AMD with Intel. And now Intel doesn't even have the speed advantage.

OptionC: When they were a teeny tiny portion of the CPU market they weren't worthwhile to attack.

From the perspective of a malware author who has a financial incentive to infect as many machines as possible, that may be true. It's far less of a motivation for security researchers, aka the people who have actually been discovering and disclosing these vulnerabilities. Of course, if AMD left any low-hanging fruit, a malware author would still bite.

Let's not forget state actors. They would be interested in vulnerabilities in both high- and low-market-share products, a ...


If AMD chips were under anywhere near the same scrutiny as Intel's chips they'd find more flaws. That's obvious. AMD doesn't have some magic recipe for security that Intel just can't figure out. They're using the same damn architecture FFS.
 
2020-06-16 11:21:03 AM  

Russ1642: If AMD chips were under anywhere near the same scrutiny as Intel's chips they'd find more flaws. That's obvious. AMD doesn't have some magic recipe for security that Intel just can't figure out. They're using the same damn architecture FFS.


Try reading the post you quoted again.  They are more or less under the same scrutiny.  While criminal actors may be more motivated to go after Intel, they aren't the group most likely to find these sort of vulnerabilities first.  Actual researchers and state actors are.  Those two groups are not more motivated to focus solely on Intel.  Despite this, we have seen more Intel-only vulnerabilities, no AMD-only vulnerabilities, and some Intel-more vulnerable with regards to attacks that affect both.

To underline that point, how many of these CPU vulnerabilities from the last decade or so have been discovered first by malware authors?  Nothing comes to mind.  And that's the group who will care far more about market share, since they have a strong financial incentive to find and exploit vulnerabilities.  They are obviously more successful on the software front, and even then often rely on systems that aren't receiving updates.  Attacking already known vulnerabilities.

Talking about architecture doesn't help your case either. I can't recall any of the vulnerabilities being inherent in the core x86 (and extensions) architecture. The issues would all be in common were that the case, but that isn't the case.
 
2020-06-16 11:30:43 AM  

bk3k: Russ1642: If AMD chips were under anywhere near the same scrutiny as Intel's chips they'd find more flaws. That's obvious. AMD doesn't have some magic recipe for security that Intel just can't figure out. They're using the same damn architecture FFS.

Try reading the post you quoted again.  They are more or less under the same scrutiny.  While criminal actors may be more motivated to go after Intel, they aren't the group most likely to find these sort of vulnerabilities first.  Actual researchers and state actors are.  Those two groups are not more motivated to focus solely on Intel.  Despite this, we have seen more Intel-only vulnerabilities, no AMD-only vulnerabilities, and some Intel-more vulnerable with regards to attacks that affect both.

To underline that point, how many of these CPU vulnerabilities from the last decade or so have been discovered first by malware authors?  Nothing comes to mind.  And that's the group who will care far more about market share, since they have a strong financial incentive to find and exploit vulnerabilities.  They are obviously more successful on the software front, and even then often rely on systems that aren't receiving updates.  Attacking already known vulnerabilities.

Talking about architecture doesn't help your case either. I can't recall any of the vulnerabilities being inherent in the core x86 (and extensions) architecture. The issues would all be in common were that the case, but that isn't the case.


Yeah, I replied to the wrong post. And yes, some of the vulnerabilities have been in common on both sides (Spectre comes to mind). Just because you're unaware of them doesn't make them non-existent. Also, I don't believe for a second that AMD's chips are under the same scrutiny as Intel's.
 
2020-06-16 10:21:25 PM  

Russ1642: bk3k: Russ1642: If AMD chips were under anywhere near the same scrutiny as Intel's chips they'd find more flaws. That's obvious. AMD doesn't have some magic recipe for security that Intel just can't figure out. They're using the same damn architecture FFS.

Try reading the post you quoted again.  They are more or less under the same scrutiny.  While criminal actors may be more motivated to go after Intel, they aren't the group most likely to find these sort of vulnerabilities first.  Actual researchers and state actors are.  Those two groups are not more motivated to focus solely on Intel.  Despite this, we have seen more Intel-only vulnerabilities, no AMD-only vulnerabilities, and some Intel-more vulnerable with regards to attacks that affect both.

To underline that point, how many of these CPU vulnerabilities from the last decade or so have been discovered first by malware authors?  Nothing comes to mind.  And that's the group who will care far more about market share, since they have a strong financial incentive to find and exploit vulnerabilities.  They are obviously more successful on the software front, and even then often rely on systems that aren't receiving updates.  Attacking already known vulnerabilities.

Talking about architecture doesn't help your case either. I can't recall any of the vulnerabilities being inherent in the core x86 (and extensions) architecture. The issues would all be in common were that the case, but that isn't the case.

Yeah, I replied to the wrong post. And yes, some of the vulnerabilities have been in common on both sides (Spectre comes to mind). Just because you're unaware of them doesn't make them non-existent.


Still having trouble reading? Again, in the post you quoted I mention that some attacks affect both, which is the second time I mentioned this. Stop trying to swat down straw men. Just stop.

Russ1642: Also, I don't believe for a second that AMD's chips are under the same scrutiny as Intel's.

And I literally just explained exactly why you are completely mistaken.  Who has a reason to focus on Intel?

Malware authors - they have that motivation due to market share, but they're not really the ones getting it done with respect to hardware level vulnerabilities.  Again show me those examples of these guys finding hardware vulnerabilities first.  I doubt you will have much, if anything to point out.
Researchers - they do not have that motivation.  They're researchers and they aren't getting paid per infection.  They aren't ransoming data, stealing industrial secrets, etc.  To them it is just as good to find an AMD vulnerability as an Intel one.  Perhaps even better, since they'd be finding something more relatively rare thus earning them more attention.
State actors - they do not have that motivation because they are well resourced enough to do "all of the above".  They aren't interested in leaving holes in their capabilities, thus they will not target ONLY Intel.
 
Displayed 28 of 28 comments


This thread is closed to new comments.
