292
u/Plus-Weakness-2624 1d ago edited 1d ago
Squirt, you won't even be able to type that shit in if someone hasn't done that nanosecond optimization for the OS you're running
433
u/Anaxamander57 1d ago
If you run that program every nanosecond then in one second you'll have saved a billion years. Think about that.
96
18
u/ZunoJ 12h ago
How did you come up with that number?
1 s = 1,000,000,000 ns
This means you save 100 ns a billion times over. In total you save 100,000,000,000 ns, which is 100 s
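A quick sanity check of that arithmetic in Python:

```python
saved_ns_per_run = 100
runs = 1_000_000_000           # one run per nanosecond for one second

total_saved_ns = saved_ns_per_run * runs
total_saved_s = total_saved_ns / 1_000_000_000
print(total_saved_s)           # 100.0 -- seconds, not a billion years
```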
73
229
u/AzureBeornVT 1d ago
100 nanoseconds adds up with time
85
1
u/InvestingNerd2020 6h ago
And volume of requests or users. The extreme number of questions sent to Google search makes it worth it.
56
u/Cacoda1mon 1d ago
100 nanoseconds on application startup 👆
61
u/Madrawn 1d ago
I'm sure it's some kind of flu all programmers catch sometimes. A colleague recently was so lost in the sauce that he started talking in big-O notation and finally, successfully, cut a startup type-cache initialization from 4 seconds down to 2. After spending 2 days on it. For a customer project paid at a fixed price. On a backend service that runs on an always-on app service.
163
u/Glum-Echo-4967 1d ago
Saving 100 ns can actually make a big difference.
In trading, prices can fluctuate rapidly. Just 1 millisecond can mean the difference between taking a profit and taking a loss.
And then (just spitballing here) there's online gaming. You want all consoles to agree on the sequence of events, and to do this they need to communicate with each other as quickly as possible; this is why you'll see PC gamers on wired Ethernet or a fiber-optic Internet connection.
34
u/BlurredSight 1d ago
emmmmmm
So yes, for HFTs it does matter, because they make hundreds of thousands of dollars just playing bids/asks, but even then physical distance to the exchange makes that difference too. For gaming, ping and packet loss matter, but only to a certain extent: the game server only processes information at a fixed number of ticks per second, and more importantly, to create a fair environment, netcode usually rounds to about 60 ms for both parties.
9
u/Glum-Echo-4967 1d ago
unless you're Nintendo, then there's not really a "game server" - a matchmaking server matches you up with a bunch of other players and then one of those players hosts the game.
3
2
u/purritolover69 4h ago
The only major peer-to-peer matchmaking games I can think of in 2025 are Destiny and GTA Online, both of which came out long ago, which is why they're peer-to-peer. It's generally far less secure than a server-side architecture, so the vast majority of online games now are server-side.
1
3
u/noahdaboss1234 13h ago
100 ns is to 1 ms as 53 minutes is to 1 year. That's literally 4 orders of magnitude.
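The proportion checks out (a 365-day year has 525,600 minutes):

```python
ns_to_ms = 100e-9 / 1e-3              # 100 ns as a fraction of 1 ms
minutes_per_year = 365 * 24 * 60      # 525,600
min_to_year = 53 / minutes_per_year   # 53 min as a fraction of a year

print(ns_to_ms, min_to_year)          # both about 1e-4: 4 orders of magnitude
```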
6
u/SilasTalbot 1d ago
It's more about algos that need to run billions of times to accomplish a task, vs running something really fast one time in isolation.
That being said, you might enjoy the book Flash Boys by Michael Lewis about the history of high frequency trading, and where it ended up as a parasitic disease in the 2010s. Really breaks it down in easy to understand language and makes it entertaining, as Lewis does.
There's a great bit about a guy who was running his own fiber from New York to Chicago to be the fastest in capturing the arbitrage between futures markets (chi) and actual products (ny). He was out there in person on the side of the road during construction yelling at them every time they had to zig-zag around something. Even if they had to cross a road, he wanted it at 45 degrees vs 90, to minimize the total length.
Then a few years later someone else came along and used a chain of microwave towers to beat his speed.
0
u/Think-Corgi-4655 23h ago
Yeah and 100 ns is still only 0.0001 ms. And it'll still fluctuate with hardware
67
u/GoGoGadgetSphincter 1d ago
Everyone I've known who thinks performance isn't important inevitably writes something so awful that it shuts down production and causes a work stoppage at our company. Then they shift their focus from defending their poor coding practices to attacking the tech stack. Just say you're lazy and you don't care so we know that we have to load test the dirt simple SSRS report you built that doesn't generate more than 500 rows but somehow takes 40 minutes to run.
13
u/BlurredSight 1d ago
But here's one small counter to this. You're absolutely right for large applications like Salesforce, Google Workspace and Search, Microsoft Office/Teams, and all of microcode development. But when someone is tasked with optimizing a program like Plex, spending 3 days on a 50 ns improvement in processing media headers could've been spent on features customers might actually notice.
21
u/allarmed-grammer 1d ago
5G peak speed is 20 Gbps, i.e. 20 × 1,000,000,000 bps. 100 ns is 0.0000001 s. 20 × 1,000,000,000 × 0.0000001 = 2,000 bits. So 100 ns is worth 250 bytes of 5G data transmission, which could carry 250 symbols in ASCII coding, just saying.
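Same arithmetic in integer Python:

```python
peak_bps = 20 * 10**9        # 20 Gbps peak 5G throughput
window_ns = 100              # 100 ns window

bits = peak_bps * window_ns // 10**9   # bits transferred in 100 ns
nbytes = bits // 8
print(bits, nbytes)                    # 2000 250
```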
-2
u/noahdaboss1234 13h ago
An additional 250 bytes per 20 gigabytes is like comparing 83 pixels to 10 hours of HD video, or adding a single sentence to an entire library of 20,000 books. That's not gonna be worth the time it takes you to find and implement it.
2
u/allarmed-grammer 10h ago
Oh my sweet summer child
Before a transmission starts, the transmitter and receiver exchange several control messages. Take connecting to a 5G cell as an example. There is a synchronization procedure that establishes the UE's connection to the cell: the RU (radio unit of the 5G base station) sends the UE (user equipment, a phone with 5G capability) a PSS, the primary synchronization signal, followed by an SSS, the secondary synchronization signal. All just to align the timing of the incoming data transmission.
PSS and SSS each occupy 1 OFDM symbol with 127 subcarriers. The modulation used in these messages is QPSK, quadrature phase shift keying, meaning each subcarrier encodes 2 bits of data. 127 × 2 = 254 bits, which is almost 32 bytes. And if those 32 bytes are received in the wrong time frame, the whole transmission won't start. Meaning no matter how many pixels your video has, it won't be transmitted at all.
And there are a lot of additional kinds of control messages responsible for start and stop time frames, dynamic carrier spacing modification and so on. If they are missed during an ongoing transmission, it will fail.
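The sizing claimed above, checked with the numbers as given in the comment (2 bits per subcarrier for QPSK):

```python
subcarriers = 127        # PSS/SSS occupy 1 OFDM symbol of 127 subcarriers
bits_per_sc = 2          # QPSK: 2 bits per subcarrier

bits = subcarriers * bits_per_sc
print(bits, bits / 8)    # 254 bits, 31.75 bytes -- "almost 32"
```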
-1
u/noahdaboss1234 10h ago
And what's the allowable delay between all of those messages? Something tells me it makes that 100 ns look negligible.
1
u/allarmed-grammer 10h ago edited 9h ago
It depends on the subcarrier spacing. Typical SCS values used in 5G are 15 kHz, 30 kHz, 60 kHz, 120 and 240 kHz.
With 15 kHz SCS the OFDM symbol duration is approximately 66.7 µs, so a 100 µs time frame holds about 1 OFDM symbol.
With 240 kHz the symbol duration is about 4.17 µs, so a 100 µs time frame holds nearly 24 OFDM symbols.
The data packed into an OFDM symbol, the physical resource block (PRB) consisting of subcarriers, isn't constant either, and depends on the chosen bandwidth.
And within an OFDM symbol there is the cyclic prefix (CP), which acts as a "time buffer" for compensation: ~4.7 µs for a normal CP at 15 kHz SCS, ~2.3 µs at 30 kHz, ~1.2 µs at 60 kHz, ~0.6 µs at 120 kHz.
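Symbol duration is just the inverse of the subcarrier spacing; a quick table (ignoring the cyclic prefix):

```python
# OFDM symbol duration = 1 / SCS, ignoring the cyclic prefix
for scs_khz in (15, 30, 60, 120, 240):
    duration_us = 1000 / scs_khz        # kHz -> microseconds
    per_100us = 100 / duration_us       # symbols fitting in a 100 us window
    print(f"{scs_khz:3d} kHz: {duration_us:6.2f} us/symbol, "
          f"{per_100us:4.1f} symbols per 100 us")
```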
0
u/noahdaboss1234 9h ago
Considering you specified a 5G cell (and not WiFi), and in 100 ns light itself can only travel about 100 feet, I call total BS on latency between messages being measured in nanoseconds. Unless there are secretly cell towers placed every 5 feet?
And none of that refutes my original point: 240 bytes is fuck-all compared to 20 GB.
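For reference, light (and radio) in a vacuum covers about 30 m in 100 ns:

```python
c = 299_792_458              # speed of light, m/s
distance_m = c * 100e-9      # metres covered in 100 ns
print(round(distance_m, 2), round(distance_m * 3.28084, 1))  # ~29.98 m, ~98.4 ft
```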
1
u/allarmed-grammer 9h ago
Radio waves, which 5G is based on, are a type of electromagnetic radiation with a different physical origin than light's; dunno why you're measuring everything with the speed of light.
It depends on what those 240 bytes are. If they're part of the transport protocol, it doesn't matter whether it's 1 MB, 1 GB, 20 GB or a petabyte: the transmission will fail if they're corrupted.
0
u/noahdaboss1234 9h ago
"Are a type of electromagnetic radiation"? Bitch, do you mean light? Because yeah, that's what light is. That's why I used light speed.
And I don't know if you realize how efficiency works, but adding 240 bytes to 20 gigabytes is a gain of about 0.000001%.
1
u/allarmed-grammer 9h ago
The electromagnetic wave speed, or whatever you're referring to in the antenna context, is the time delta between transmitter and receiver. The receiver applies it to its reception window; it says nothing about how the transmitter fragments the outgoing wave into time frames.
Funny to observe your aggressive ignorance regarding real-time systems, but really, be so kind and restrain yourself. There is no need to respond with slurs if you are lacking some understanding.
10
u/___OldUser101 1d ago
Every CPU cycle counts.
1
u/noahdaboss1234 13h ago
Not always. What matters is how much faster the code gets versus how long the improvement takes. Is a 0.0001% efficiency gain really worth the 24 hours of developer pay it'll cost for someone to spend 3 days finding and implementing that time save?
51
u/cheezballs 1d ago
100 ns per iteration over a million-element set? This meme fucking sucks. You suck.
8
15
u/Squeebee007 22h ago
If your app is processing millions of entries per hour and you can save 100 nanoseconds per entry, you’ll get a raise.
3
u/noahdaboss1234 13h ago
No, you won't. If you find a way to save 100 nanoseconds per entry, you'd need to process at least 100,000 operations per SECOND to gain 1% more efficiency. 5 million entries per hour is a saving of 0.5 seconds per hour, or about 0.014%. In almost anything that won't be worth the time it took to find the time save.
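The break-even arithmetic, as a sketch:

```python
def fraction_saved(saved_ns_per_op, ops_per_second):
    """Fraction of each wall-clock second recovered by the optimization."""
    return saved_ns_per_op * 1e-9 * ops_per_second

# 1% gain needs 100,000 ops/s at 100 ns saved each
assert abs(fraction_saved(100, 100_000) - 0.01) < 1e-12

# 5 million entries/hour: ~0.5 s saved per hour, ~0.014%
gain = fraction_saved(100, 5_000_000 / 3600)
print(f"{gain:.4%}")
```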
10
u/srsNDavis 1d ago
That 100 ns speedup can actually be significant, especially if it's 100 ns that grows with the input size, so your gains add up at scale.
5
4
u/kasirate 21h ago
100ns can be the difference between a MOSFET exploding or not
5
u/TheJohnSB 21h ago
When I worked in the car industry I'd chase "cycles" (1/60 s) of weld time to try to reduce our cell time. People would look at me like I was crazy, but I'd just turn to them and say:
"We do 10 welds on this part. If I can knock even 1 cycle off a weld, we save one second every 6 parts. That gives us enough time to produce an extra 10 parts an hour. It means you won't have to come in on overtime every weekend when shit goes wrong and takes the cell down."
Even just 0.02 s is worth chasing.
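The weld math, for anyone checking (integer cycles, one cycle = 1/60 s):

```python
welds_per_part = 10
cycles_saved_per_weld = 1      # one cycle = 1/60 s

cycles_saved_per_part = welds_per_part * cycles_saved_per_weld  # 10 cycles
parts_per_second_saved = 60 // cycles_saved_per_part            # 6 parts
print(parts_per_second_saved)  # 6 -- one full second back every 6 parts
```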
6
3
u/NotMyGovernor 1d ago
I'm on a task where I have to speed up a C++ function by microseconds. Fun =)
2
2
u/thinkingperson 8h ago
Yeah, it matters, 'cause C/C++ code at the lower level usually runs many, many more times.
2
u/WavingNoBanners 7h ago
If you think this is self-congratulatory, wait until you see people boasting about SQL optimisation.
4
u/Forsaken-Scallion154 1d ago
And it only took 500 additional lines of code and a new injection vulnerability.
15
7
1
u/renrutal 23h ago
Funny that last week I did a refactor (adding an interface to a class and making the callers use that instead) and I had to prove it did not change the performance of the application much.
It did increase the average evaluation latency by 150 nanos. Not super bad, but the p99 is under 12 µs, so over 1% worse.
Still peanuts compared to some network stack latency.
1
u/Athlaeos 22h ago
and then you post it online and some asshole goes and speeds up your program by a factor of several thousands
1
1
u/klippklar 22h ago
Yea, but how many nanoseconds once we quadruple the input / let it run a few thousand times?
1
u/masagrator 21h ago
Recently made a 100k-entry hash table search 2× faster: from 400 to 200 ns on average. 😂
1
1
u/Darxploit 19h ago
Last week I sped up a sales report feature of my colleague's from 20 min to under 2 min. It turned out they were making unnecessary repeated database calls in a loop..
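A minimal sketch of that kind of fix, assuming a hypothetical `sales` table (sqlite3 here, but the pattern is the same anywhere): batch the query instead of hitting the database once per loop iteration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(i, i * 10.0) for i in range(1, 501)])
sale_ids = list(range(1, 501))

def totals_slow(conn, ids):
    # One database round trip per id -- the pattern behind the 20 min report.
    return {i: conn.execute("SELECT total FROM sales WHERE id = ?",
                            (i,)).fetchone()[0] for i in ids}

def totals_fast(conn, ids):
    # One round trip for the whole batch.
    marks = ",".join("?" * len(ids))
    rows = conn.execute(
        f"SELECT id, total FROM sales WHERE id IN ({marks})", ids)
    return dict(rows.fetchall())

assert totals_slow(conn, sale_ids) == totals_fast(conn, sale_ids)
```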
1
u/BA_lampman 19h ago
If that was a CPU operation per pixel on a 1080p image, you just saved ~200 milliseconds.
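That estimate in Python:

```python
pixels = 1920 * 1080                # pixels in one 1080p frame
saved_ms = pixels * 100e-9 * 1000   # 100 ns saved per pixel, in milliseconds
print(round(saved_ms))              # ~207 ms per frame
```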
1
1
u/MatsSvensson 9h ago
As a PHP & Java programmer, I frequently speed programs up by minutes.
It's just something I do.
But you don't see me bragging about it here.
No sir!
1
u/Terrorscream 1d ago
How many times is that program running? That could be making a significant difference
-9
-3
1.5k
u/Skoparov 1d ago
I swear, this meme pops up here every month, and every time the OP is told that they're a dumbass and that 100 ns is a pretty decent speed bump in certain areas. Then the cycle continues.