Hmm. A friend of mine claimed that requesting a trade with another player in World of Warcraft crashed his router. When he needed to trade, he always insisted that the other player initiate the transaction.
I was tempted to dismiss this as pure imagination, but he was a smart and knowledgeable fellow who worked as a programmer, so I'm assuming he had done some investigation before reaching this conclusion.
I know too little about networking equipment, but I can see how certain byte sequences (timed right?) could have some magic meaning. Given enough traffic you're likely to send just such a sequence eventually. Perhaps certain versions of the WoW client spat out just such a magic sequence for the particular router he happened to use?
This wouldn't happen with the line coding of modern network protocols. The key part of this story is that T1 uses the older AMI line coding, which is susceptible to losing sync when fed the right data pattern.
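To make the AMI failure mode concrete, here's a toy sketch (my own illustration, not from the thread): in AMI, one-bits are sent as pulses of alternating polarity and zero-bits as silence, so a long run of zeros gives the receiver no pulses to recover its clock from. Classic T1 ones-density rules cap this at 15 consecutive zeros, which is the problem B8ZS substitution later solved.

```python
# Toy AMI (Alternate Mark Inversion) encoder: each 1 bit is a pulse of
# alternating polarity (+1/-1); each 0 bit sends no pulse at all.
# Clock recovery depends on seeing pulses, so long pulse-free stretches
# are dangerous -- classic T1 rules forbid more than 15 zeros in a row.

def ami_encode(bits):
    polarity = 1
    out = []
    for b in bits:
        if b:
            out.append(polarity)
            polarity = -polarity  # alternate mark inversion
        else:
            out.append(0)
    return out

def longest_zero_run(symbols):
    run = best = 0
    for s in symbols:
        run = run + 1 if s == 0 else 0
        best = max(best, run)
    return best

# A data pattern full of zero bits (like padding in a file header)
# produces a long pulse-free stretch on the wire.
data = [0] * 32 + [1, 0, 1]
line = ami_encode(data)
print(longest_zero_run(line))   # 32 -- well past the 15-zero limit
print(ami_encode([1, 1, 1]))    # [1, -1, 1] -- alternating pulses
```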
My last name contains the character sequence "rz".
Back in the BBS days this would trigger a Zmodem transfer on certain clients. It made a lot of people upset.
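For anyone who never ran a BBS terminal: ZMODEM supported auto-download, where the client watched the incoming data stream for the announcement that `sz` transmits (the literal text "rz" followed by a carriage return and a ZMODEM frame header). A sketch of why short matches misfired; the exact trigger strings below are my approximation of the protocol, not quoted from any implementation:

```python
# Toy version of a BBS terminal's ZMODEM auto-start detector.  Real
# clients looked for the announcement sz transmits -- roughly the text
# "rz\r" followed by a ZMODEM frame header -- but sloppy ones matched
# on too short a prefix, so ordinary chat text containing the trigger
# (say, a surname with "rz" in it) kicked off a bogus transfer.

NAIVE_TRIGGER = b"rz"            # what a buggy client effectively matched
STRICT_TRIGGER = b"rz\r**\x18B"  # closer to the real announcement prefix

def would_autostart(stream: bytes, trigger: bytes) -> bool:
    return trigger in stream

chat = b"Hi, my name is Schwarz, nice to meet you"
print(would_autostart(chat, NAIVE_TRIGGER))   # True  -- spurious download prompt
print(would_autostart(chat, STRICT_TRIGGER))  # False -- strict match is safe
```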
>An excel spreadsheet crashed this company's network.
>But it wasn't malware.
>The truth is much weirder.
>Try this out, open up a xls (not xlsx) file in your favorite text/hex editor. Notice all the repeating characters in the header.
>When receiving POP3 emails with an excel attachment, the characters bit patterns caused a signalling pattern on the physical copper of the company's T1 line, crashing the network equipment.
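The repeating characters the tweet mentions come from the OLE2 / Compound File container that .xls uses: as I recall the format, after the 8-byte magic number the unused DIFAT slots are filled with 0xFF and free space with 0x00, so real .xls files open with long runs of identical bytes. A quick sketch to see this (the synthetic header below stands in for a real file; point the function at an actual .xls to check):

```python
# Measure the longest run of a repeated byte in a file header.  The
# synthetic header mimics a small .xls: the OLE2/CFB magic number, a
# zeroed CLSID, then a mostly-unused DIFAT region filled with 0xFF.

def longest_byte_run(data: bytes):
    best_byte, best_len, run, prev = None, 0, 0, None
    for b in data:
        run = run + 1 if b == prev else 1
        if run > best_len:
            best_byte, best_len = b, run
        prev = b
    return best_byte, best_len

magic = bytes.fromhex("d0cf11e0a1b11ae1")   # CFB signature every .xls starts with
header = magic + b"\x00" * 16 + b"\xff" * 400
print(longest_byte_run(header))             # (255, 400)
```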
I read somewhere about a home router that corrupted packets in transit. In that case it was a torrent that never completed because of it. IIRC something in the NAT engine bugged out and rewrote bytes in the payload, not just in the IP header.
More or less everything is broken. It's just that most protocols are designed to handle it.
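Part of why the torrent never completed: TCP's 16-bit ones'-complement checksum (RFC 1071) is order-insensitive, so a buggy NAT rewrite that effectively swaps 16-bit words can slip corrupted payload past the transport layer, while BitTorrent's per-piece SHA-1 check catches it and forces an endless re-download. A sketch (generic checksum over raw bytes; real TCP also covers a pseudo-header):

```python
import hashlib

# The Internet checksum sums 16-bit words with end-around carry, so
# swapping any two words leaves it unchanged -- TCP accepts the segment.
# BitTorrent re-checks every piece against a SHA-1 hash, which differs,
# so the corrupted piece fails verification forever.

def internet_checksum(data: bytes) -> int:
    if len(data) % 2:
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF

good = b"ABCDEFGH"
bad = b"CDABEFGH"  # first two 16-bit words swapped in transit

print(internet_checksum(good) == internet_checksum(bad))          # True: TCP can't tell
print(hashlib.sha1(good).digest() == hashlib.sha1(bad).digest())  # False: torrent can
```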
Avoiding this issue would be a pleasant side effect of encryption. Since encrypted data is indistinguishable from noise, it wouldn’t matter if the underlying format has specific bit patterns that mimic a signal pattern.
It's not particularly uncommon to have a non-cryptographic whitening/scrambling step as part of high-speed signalling protocols (e.g. PCI-E) in part for this reason. Even for interconnects between different chips on the same PCB.
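A minimal sketch of what such a scrambling step does: XOR the data with an LFSR-generated pseudorandom stream so that even pathological payloads (all zeros, repeating headers) look random on the wire, and the same operation at the receiver undoes it. The taps below loosely follow the x^16 + x^5 + x^4 + x^3 + 1 polynomial I believe PCIe Gen1/2 uses, but treat the details (seed, bit order, tap positions) as illustrative, not spec-accurate:

```python
# Additive LFSR scrambler sketch: whitening by XOR with a pseudorandom
# keystream.  Because XOR is self-inverse, running the same scrambler
# at the receiver recovers the original data.

def lfsr_stream(nbytes: int, seed: int = 0xFFFF) -> bytes:
    state = seed
    out = []
    for _ in range(nbytes):
        byte = 0
        for _ in range(8):
            bit = state & 1
            byte = (byte << 1) | bit
            # feedback from taps (illustrative positions, see lead-in)
            fb = bit ^ ((state >> 3) & 1) ^ ((state >> 4) & 1) ^ ((state >> 5) & 1)
            state = (state >> 1) | (fb << 15)
        out.append(byte)
    return bytes(out)

def scramble(data: bytes) -> bytes:
    ks = lfsr_stream(len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

payload = b"\x00" * 16            # worst case: all zeros, no transitions
wire = scramble(payload)
print(wire != payload)            # True: no long identical-byte run on the wire
print(scramble(wire) == payload)  # True: same XOR descrambles at the far end
```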
In a monkeys-in-front-of-typewriters world, statistically, you are just as likely to have a one-off event that matches a specific bit pattern with the encrypted format as with the underlying format. It would not be reproducible, though, since most encryption uses nonces.
https://threadreaderapp.com/thread/1838256923428438345.html
In phreaking tradition I think it’s appropriate to call this the Excel Box.
So if you know the proper byte sequences (incantations), you can basically mess with (do magic on) some routers.
Low-level signaling is a completely different world that we never even bother to think about these days - but it has its own idiosyncrasies and strangeness that have - mostly - been factored out.
Especially since modern encryption means that the same byte sequence on the wire will not be sent even if you do the exact same thing again. It covers quite a few sins.
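A toy illustration of the nonce point (NOT a real cipher - the keystream here is just HMAC-SHA256 in a CTR-mode style, purely to show the mechanism): because each message gets a fresh random nonce, encrypting the exact same plaintext twice puts completely different bytes on the wire.

```python
import hashlib, hmac, secrets

# Toy nonce-based stream cipher: keystream = HMAC-SHA256(key, nonce||counter).
# Do not use for real crypto -- this only demonstrates why identical
# plaintexts never repeat on the wire when a fresh nonce keys each message.

def keystream(key: bytes, nonce: bytes, nbytes: int) -> bytes:
    out, counter = b"", 0
    while len(out) < nbytes:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:nbytes]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(12)  # fresh per message
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

key = secrets.token_bytes(32)
msg = b"trade request"              # the exact same action, sent twice
a, b = encrypt(key, msg), encrypt(key, msg)
print(a != b)                       # True: different bytes on the wire each time
```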
I'd have thought this wouldn't physically be possible because of the OSI model. Unless it's satire and I've missed the point.
For more context, read the comment[0] on Reddit's /r/sysadmin thread from the follow-up tweet[1]. It boils down to a certain bit-pattern in the spreadsheet's headers that corresponded to a signalling pattern in their T1 line, around the year 2003, causing the T1 equipment to fail.
[0] (2018) https://old.reddit.com/r/sysadmin/comments/9si6r9/postmortem...
[1] https://twitter.com/lauriewired/status/1838257118685921694
Oh, it's entirely possible in principle (I don't know about the details of T1 in this specific case).
Many years ago, there used to be a vulnerability with a lot of modems where you could send a ping packet to a machine connected via dialup, and the machine would send back a response that its modem would interpret as telling it to hang up the call: https://seclists.org/bugtraq/1998/Sep/192
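That modem bug was the Hayes "+++" escape: an ICMP echo reply carries the request's payload back verbatim, so a ping payload containing "+++ATH0" landed in the victim's own modem data stream, and firmware that skipped the required guard time of silence around "+++" dropped into command mode and hung up. A sketch of that layering violation (my own simplification of the firmware logic):

```python
# Naive model of a Hayes modem's data path: watch the byte stream for
# the "+++" escape and treat what follows as a command.  Proper modems
# required ~1s of silence around "+++" (the guard time); cheap firmware
# that skipped the check executed commands out of echoed packet data.

def naive_modem(stream: bytes):
    """Return the command a guard-time-less modem would execute, if any."""
    idx = stream.find(b"+++")
    if idx == -1:
        return None
    rest = stream[idx + 3:]
    end = rest.find(b"\r")
    return rest[:end if end != -1 else len(rest)]

# The ping reply puts the attacker's payload on the victim's modem:
echo_reply_payload = b"....+++ATH0\r...."
print(naive_modem(echo_reply_payload))   # b'ATH0' -- modem hangs up the call
```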
The OSI model is an abstraction that can be broken by implementation bugs or design flaws, not an immutable law of the universe.
EDIT: I see that some of the Twitter replies beat me to mentioning this.
The OSI model makes it more possible, if anything. After all, to each layer, everything above it is just, well, bits... And if some layer is poorly implemented, it can interpret a certain run of bits as something else and act accordingly, but wrongly.
There is a lot of bad code out there, especially when you have multiple implementations, all doing their own thing at their own level.
> JSDevOps says lower level bitstream-induced error is "fucking bollocks".
Surely, JSDevOps knows what he is talking about. I will rest peacefully with this in mind.
there's no need to be a jerk
clearly a troll, the username says it all