Replies (106)

Nobody really knows what quantum computing is about or whether it's feasible at all. At this stage it is nothing more than vacuous promises and shitcoin-style fear-mongering. I suggest you look deeper than superficial FUD before making such claims.
Or maybe you just haven’t thought about the meaning of these words, nor observed Bitcoin for what it objectively is, or what value even means. Maybe you are stuck in the “just money” definition. Who knows. I’m not the one saying Bitcoin is broken by any means. Bitcoin is the threat.
No, Bitcoin openly falsifies your beliefs. Show me another system in physics that openly measures the isomorphism of Boltzmann Entropy (heat/Kelvin) and Shannon Entropy (satoshis/utxo dataset) where the object is both the memory and the time of the relationship. No physicist can define existence, quantized time (thus simultaneity), measurement, observer. Bitcoin is the proof physicists could never produce.
Or Bitcoin is the technology that replaces physical theories with an open verifiable instantiation that anyone can literally use. Why are you so convinced Bitcoin is broken?
Kode 2 months ago
Who said Bitcoin is broken? The technology risks becoming obsolete if we don't push forward post-quantum encryption, that is all. I think your post confuses the tech and the idea, or maybe readers confuse it for themselves: you're clear about the idea being unbreakable, though the tech may break if we do nothing.
“The technology risks becoming obsolete if we don't push forward PQC.” You just did. The proof you use to claim “the tech may break” is falsifiable. It is far more probable that Bitcoin would be undermined by modifying it to counter a threat model built on physics that Bitcoin itself invalidates.
You really think we'll actually be able to make a computer that can break ECDSA or Schnorr? I think it's FUD. I don't think it's achievable.
Kode 2 months ago
Why so absolute? Just need to have an alternative ready and tested for when it is a threat... Bitcoin is an ideal and can remain perfect in that, and technology evolves, so we upgrade to maintain the course towards the idea; simple as that to me. Capitalism at some point was undermined by a lack of tech to enforce sound money, let's not lose to that again.
Plus, if you don't reuse addresses, the pubkey isn't revealed, and an attack would require SHA-256 to be broken. I think that's impossible, really. There's no evidence to suggest breaking ECDSA is even possible.
H h@nostr.my.id 2 months ago
It's motivated reasoning. He sees whatever he wants to see and disregards anything that doesn't fit into whatever his original hypothesis was
Jack K
My work on Bitcoin physics led me to Christ as the only theology that is thermodynamically coherent with a redefined working instantiation of physics we can all verify. I am still learning myself, but yes, I agree with the sentiment largely while not overly educated in the Bible. I am trying to balance my work in Bitcoin physics, family, fiat job, and reading the Bible. Hard to know if I am succeeding 😅
Because Quantum FUD is more dangerous than QC itself. Putting aside @Jack K's argument, because it is deeper than mine, and he's probably onto something I don't fully grok... QC can't scale, because at scale thermodynamics takes over. At scale, the cat is alive or it is dead.
Yeah, but they CAN'T scale. Because when you scale, you get thermodynamics. I can't follow everything @Jack K says, but what he is saying tracks with something that is obviously true. At scale the quantum cat is definitely dead. Engineers can do a lot of things, but they can't change physics.
JOE2o 2 months ago
I don't know where you're getting this, but thermodynamics and kitty-dynamics don't apply here. Schrödinger's cat is a thought experiment to illustrate the quantum/classical divide. His actual work is an equation that very accurately describes how the quantum state of a physical system changes over time, and that's what's really relevant here. Quantum computers work, we know this. They work at the scale we have now (maybe 48 logical qubits, depending on who you trust), and they work just the same at the scale of 2,000 logical qubits, which is what's needed to crack Bitcoin private keys. (Unless we further optimize, in which case it'll be fewer.) Scaling the sheer number of qubits does not imply "getting bigger" in the sense of lessening quantum effects. Basically, increasing the complexity and number of these isolated quantum units does not allow the system to become a large or "hot" classical object. The system remains fundamentally quantum.
Please show me anywhere else in physics where we have isomorphism between a quantifiable amount of Boltzmann entropy (heat/Kelvin) and Shannon entropy (satoshis/UTXO set) whose relationship is both memory and time. To say Bitcoin is not physics is literally to avoid the obvious. We are literally watching the construction of time from first-principles thermodynamics. Nothing else in physics has demonstrated discretized time through thermodynamic change with entropy on both sides of the transformation quantifiable.
I guess the empirical evidence of Bitcoin that anyone can verify is meaningless. It has no relationship to energy, time, entropy and information. It’s not physics, it’s just money! Silly me.
Even a chair is made out of electrons and shit. But if someone asks what it is made of, any sane person would say metal or wood etc. I'm being dead serious, you should get yourself checked or talk to your loved ones about it. The way you are thinking right now shows very clear signs of mental illness
😂 Ok man. “Bitcoin maximalist” literally rejecting the objective empirical proof of Bitcoin. It’s more likely you’re just incapable of understanding what I am talking about, which is why I resorted to a metaphor for the layman in the first place; the very point you are stuck on.
Schrödinger’s cat was a joke. He was making fun of the idea of the exact thing quantum computing is trying to do: a macroscopic system that stays in perfect superposition while being constantly measured and entangled with the rest of the universe. Nothing they are calling progress calls the ceiling into question.
JOE2o 2 months ago
Sorry but that makes no sense. Quantum computing works, the process is well understood, there is no debate about this in serious science.
Well, at least you're being funny now. In all seriousness, what makes you so sure that it works? Works in what way? I am genuinely interested in why people are so convinced by this. No meaningful results have been shown that challenge the existence of a hard ceiling on scaling. (IOW, nobody has ever proven Schrödinger wrong.) All the results are still consistent with a hard ceiling. What gives you a concrete impression that this is a solved problem and they just need to go a little further? I know that is kind of the narrative of the hype, but have you ever TRIED to drill down on it? Side question: do you believe AGI is coming too?
JOE2o 2 months ago
Because it's been proven by experiment multiple times, I dunno what to tell you. To the degree it's been proven that if you split atoms in a certain way you get bombs. As in, it's been proven to the degree that if you disregard the proof then nothing means anything and there's no discussion to be had, we're in "science is just my feelings" land.
JOE2o 2 months ago
Put another way, there is nothing fundamental in quantum physics itself that prevents large-scale quantum computers from existing, and we know this from experiment after experiment after experiment, observation after observation after observation. The same way we know that we can build a fusion reactor that works at scale, there is nothing in physics saying no, even though we've never built one before. It’s an engineering challenge. When we talk about dealing with heat in quantum processing, it’s not in the sense of “bigness” or preventing the system from becoming “accidentally classical” as the number of qubits grows. It’s an old-fashioned engineering challenge, basically refrigeration. The industry is moving away from the big-chip model toward a modular, networked approach and different qubit technologies. So instead of one massive chip requiring thousands of control lines you get many smaller but highly connected QPUs, and this reduces the cooling challenge to something manageable. Two entangled particles can be separated by the entire universe and still remain entangled. Modularity is not at all anti-quantum; nothing along the path to 2,000 qubits (or however many) is anti-quantum.
“Experiment after experiment” has proven exactly one thing: quantum mechanics works perfectly in microscopic, heroically isolated systems of ~100 physical qubits for microseconds. It has never once been observed to survive continuous measurement and error correction at a macroscopic scale (needed for Shor). Once the entangled system gets large enough, it crosses the threshold where thermodynamics forces decoherence and classical behavior, no matter how cold the fridge. What we've been doing so far is just increasing the isolation of the system from the environment to access more of the natural scale of quantum behavior. But we can't isolate the system from itself. Fusion happens in stars. Macroscopic quantum computation has never happened anywhere in the universe, ever. That’s not an engineering gap. That’s a physical prohibition.
JOE2o 2 months ago
What forces decoherence and classical behaviour is not an entangled system getting "large" in the sense that if you have x+1 qubits then no matter how you try to cool them or arrange them it'll never work because you've hit some kind of quantum physics speed limit. Macroscopic quantum phenomena have been observed in systems like superconductivity and superfluidity, and that shows that collective quantum behaviour can be maintained at a large scale. Basically what we're talking about is a sort of thermodynamic robustness requirement that QEC is designed to meet, not a failure of quantum mechanics at scale. (Comes down to engineering again, and not any sort of natural limit like the mass density something can accrue before it collapses into a black hole.) The solution is to break things into multiple smaller, independent quantum processing units. Each has a manageable number of qubits, say 50-100 (we've already done 50 logical qubits in all likelihood). These are easier to isolate and cool (and, bonus, subject to localized error correction). Modules communicate with each other not by some kind of ET-phone-home deal, but by generating entanglement between a communication qubit in module A and one in module B (and some other ways too). There's a lot more to it, and there are other avenues being explored that do away with qubits in the sense we know them now altogether. Basically right now there are 50 or 60 different areas in any one of which an unexpected (though not at all implausible) breakthrough would change everything. That's kind of where we are. Like AI pre-AlphaGo.
Modularity doesn’t matter. Shor needs one giant, non-equilibrium, actively-corrected, constantly-measured quantum state, not a bunch of small ones stitched together. It's not MapReduce. Superconductivity and superfluidity are passive, symmetric, equilibrium ground states. They are not that kind of macroscopic quantum system. Engineering doesn’t get to invent new physics.
JOE2o 2 months ago
Modular designs can absolutely get you a machine that can use Shor's to break a bitcoin key. This is chip design 101, combined with quantum mechanics 101. And this is only one of 60 or 70 separate angles in which an unexpected (but totally plausible) breakthrough can change the game. Do you want to go through all 70? We may not even need qubits at all... You have to be honest with yourself. You have no idea what will or won't happen. You have no crystal ball. Nobody does. Google, Cloudflare, Signal, and many others have all moved to post-quantum. Yet somehow the bitcoin community's only prevention work at the moment is to pretend like it can never happen. Classic head in the sand. The smart thing is to take it very seriously and move as fast as possible to mitigate it.
Modular designs cannot run Shor on a Bitcoin key. Shor is not MapReduce. It requires one single, global, coherent, non-equilibrium quantum state across the entire register. You have 70 engineering ideas to break the laws of physics? Good luck. I don't need a crystal ball to know that the laws of the universe win every time. And corporations grifting on the current thing? Never! Physics doesn't care about compliance, projections, roadmap promises, laundered metrics, or 70 ideas for a breakthrough in the investor kit. It doesn't care who we give a Nobel prize to, what China does, or how many billions get poured in. Physics is not subject to fiat. There is a hard physical ceiling. If you want to suspend your disbelief and catch the fever, have fun. But leave Bitcoin and ECC alone. Honestly.
JOE2o 2 months ago
You're just plain old scientifically wrong, I don't know what to tell you. You have an internet-informed grasp of quantum physics; that's better than most people, so credit where due. But you clearly don't understand things at a deep level here, certainly not at the level of those working on these systems each day, with hands dirty. Which is why the bulk of the scientific community (at least the relevant one) is saying something quite different to what you are saying. And which is why most bitcoin people who are actively engaged here agree that it, like fusion, is a case of sooner or later.
Look, man. If I’m wrong, explain, scientifically, how. Two claims: 1. Shor needs one single, global, non-equilibrium, actively-corrected, constantly-measured quantum state at macroscopic scale. That object has never been observed: not in nature, not in any lab, not once in 40 years. 2. Forty years of better isolation have already mapped the ceiling with brutal precision. We’re hugging it, not raising it. You keep answering with social proof and investor-deck talking points. That’s not science. That's fiat thinking. People need sound money and strong walls. Our freedom, privacy and security are under very real digital threat. I'm not throwing down our best weapons on a rumor from the enemy. Show me proof that a macroscopic, non-equilibrium, actively controlled, constantly measured, error-corrected universal quantum state is physically possible and I’ll eat my shoes on camera tomorrow. Until then: don’t trust, verify. Your move.
JOE2o 2 months ago
Let me clarify first, your view is that a quantum computer that can break a bitcoin key is, no matter how engineered, a physical impossibility, because the laws of physics say it is? As in this is not an engineering problem but a fundamental limit of physics, akin to faster than light travel in the conventional sense? That, with the knowledge we have today, we can conclude that, for now and for all time in the future, such a thing is totally impossible. This is your view yes?
JOE2o 2 months ago
If that is your position there is nothing to respond to. You would be coming at this with a comic-book level of certainty over your own personal understanding of what the physical limits of the universe are. If your position is that you agree such a machine may be possible but is, let's say, too difficult to construct in our lifetimes, then that's another thing. Then there's a debate. Then it makes sense to look at how it can or cannot be built. You've outlined your position as the former, but if you want to clarify that it's in fact the latter then please do.
JOE2o 2 months ago
There are ideas out there, much hinges on discrete points and causal relationships. CST is one approach, hypergraphs is interesting and takes that idea further, preserves locality. There is a bright young researcher named Jonathan Gorard who has a good interview here.
You are deflecting away from the implications. If anything, CST and hypergraph models strengthen the point: they both quantize time. Once time is discrete, the entire ontology of quantum computation breaks. Discrete time means:
- no ∂ψ/∂t → Schrödinger’s equation fails
- no continuous unitary evolution → Hamiltonians can’t generate gates
- no coherence across ticks → superposition becomes impossible
- no substrate for phase evolution → Shor’s algorithm cannot run
CST and hypergraphs don’t rescue quantum computing; they expose the contradiction it depends on. Yet you stand with confidence that quantum computing (in its current form) is inevitable. If time is quantized, the mathematical and physical foundations of QC disappear. Bitcoin simply demonstrates this discretization in practice, which is why invoking CST or hypergraphs only reinforces my argument. Bitcoin is the working instantiation of what CST and hypergraph theories are still trying to formalize. The irony is that Satoshi solved the hard part in 2009 and almost no one has realized it.
You’ve completely deflected away from the argument because your stance is untenable. This isn’t about holiness, psychology, or theology; it’s about physics. Bitcoin is the only physical system we have that produces time from energy and entropy. It is the first empirical instantiation of discrete temporal evolution, not an article of belief. It literally is empirical proof. Rejecting Bitcoin as physics is a category error, and a serious one. You’re avoiding the question because the implications are unavoidable: if time is quantized, as Bitcoin demonstrates openly in operation, the entire formalism of quantum computation collapses. Your detour into theology doesn’t change that; it just shows you have no answer. Don’t worry, the entire industry is rekt by Bitcoin, not just you.
There is a universe full of evidence that scaling this kind of system beyond a certain very low threshold makes it go classical. There is zero evidence that scaling it beyond that threshold without making it classical is possible. 40 years of QC research only confirms this. More and more heroic isolation only grinds closer to the ceiling and makes our knowledge of it more precise and certain. Failed attempts to falsify knowledge are supposed to make us more and more certain of it. Believing in things with zero evidence to even suggest that they might exist is irrational. All I have asked you for is one piece of scientific evidence that breaking this apparent law of nature, the ceiling, is even possible. You have none, so you pivot. To speculative engineering ideas. To social proofs. To reframing my position as somehow "comic-book" unreasonable. You are advocating for massive, detrimental and dangerous changes to Bitcoin, based on the unsubstantiated dream of research scientists and investors who are profiting wildly from the hype and have zero results. If you have evidence, put it on the table. I'll wait.
JOE2o 2 months ago
Which route do you want to go? There are around 60, as I said. Modular is perfectly legit. Or do you like Majorana? Do you agree 10 physical Majorana qubits per logical qubit is reasonable? 50? Let's say 30. Do you agree 1,700 logical qubits is enough to run Shor's with some further optimisations? (There have already been many optimisations.) Do you agree that makes ~51,000 physical Majorana qubits? (Talk about arrangement later.) If you want to dispute any of those points then go ahead. Or if you want to suggest that what Microsoft is up to is a pure scam, go for it too. If not then make clear you agree with all that and let's move on to the next part.
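A quick sanity check of the arithmetic in the post above; note the 1,700-logical-qubit requirement and the 30:1 Majorana overhead are the poster's assumptions, not established figures:

```python
# Back-of-envelope qubit count using the numbers claimed in the thread.
# Both inputs are assumptions from the post, not established values.
logical_qubits = 1_700       # claimed logical-qubit requirement for an optimised Shor run
physical_per_logical = 30    # claimed physical-per-logical Majorana overhead

physical_qubits = logical_qubits * physical_per_logical
print(physical_qubits)       # 51000, i.e. the ~50,000 figure cited
```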
😂 You’re asking, “Why would the creator have a son and not a daughter?” as if that somehow discredits the physics. It doesn’t. What it does reveal is that you’ve dropped the scientific argument and reached for a caricature of theology you don’t actually understand, in an attempt to discredit me without engaging the argument. First: you’re assuming biological categories apply to a creator of the universe. In Christian theology, “Son” is not a biological label; it’s a metaphysical relation, not chromosomes or reproduction. Treating it as literal mammalian biology isn’t a clever argument; it just shows you’re arguing against something you haven’t taken five minutes to understand. If your goal was to undermine my reasoning, all you’ve done is undermine your own credibility. Second: you’re confusing symbolism with mechanism. Every domain that encodes meaning (mathematics, logic, computer science) uses figurative language that no one mistakes for physical anatomy. We talk about “child processes,” “parent directories,” “orphan blocks,” “sibling nodes.” No one thinks computers are reproducing. You accept metaphor everywhere else, but suddenly pretend not to understand it here because it’s a convenient way to dodge the actual argument in an attempt to discredit me. So let’s be clear: you have no rebuttal, only a deflection. It wasn’t even a good one; you should try ad hominem next time 😂. Instead of engaging the physics on the table, you reached for a theological strawman you don’t have the background to even parody correctly. All that does is highlight that you have no technical counterargument left. Rekt!
JOE2o 2 months ago
That is how we define the number. I'm sure you agree the machines we have now work. I'm sure you agree we've got 24 logical qubits in the bag, proven. So somewhere between 24 and X is where you believe the limit is. If you believe it to be over 1,700 you have no Shor's argument anymore. If you believe it to be under 24 you also have no argument anymore. So where is it? 1,000? 500? 1,500?
So why bring up my beliefs in an attempt to discredit the physics you can’t refute? I’m not here to be mean, but I will defend myself. This is not about me, nor is it about you. If time is quantized, the physics you claim is inevitable is falsified. I am stating that we have empirical proof in Bitcoin. Whether you agree is a different story. I’m sorry, but if quantizing time breaks the model that’s literally supposed to quantize everything, it’s obviously wrong.
Yes, but it’s not belief. Bitcoin makes this empirically visible. The only reason continuous time feels intuitive is because time is the substrate you’re made of. You cannot perceive the gaps between the smallest discrete updates of reality, just as a program can’t perceive the CPU cycles that execute it. Every update to existence occurs in quantized steps, blocks of time (memory), because without discrete memory formation, nothing can change. Light = information = memory. No time → no light. No light → no information. No information → no memory. Without memory, time has no meaning; memory is time. We already measure time this way without noticing it. A second is nothing more than a human-scale unit defined as a frequency of Planck-time intervals. Planck time is literally expressed in seconds, meaning it already presupposes that time is quantized. A Planck interval is just the smallest tick (a discrete fraction of 1 second), and a second is the standard structured count (frequency) of those ticks. I.e., 1 second is composed of ~1.855×10^43 Planck blocks. All time is expressed as a frequency of this fundamental block of time. Bitcoin exposes this structure. Each block is a discrete temporal update created by real energy expenditure. Once you see how an informational system behaves when time unfolds in quantized steps, the illusion of continuity disappears, and so does the idea that physics can ignore the discreteness of time. What we have been missing is the working instantiation to provide meaning to something we could not understand.
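The Planck-tick arithmetic in the post above can be checked in a couple of lines; this only verifies the division, not the interpretation, and the Planck-time value used is the standard CODATA approximation:

```python
# Planck time in seconds (CODATA approximation)
t_planck = 5.391e-44

# How many Planck intervals fit in one second
ticks_per_second = 1 / t_planck
print(f"{ticks_per_second:.3e}")  # 1.855e+43, matching the figure in the post
```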
Pivot, pivot, pivot. I'm not going to play The Price Is Right with qubits. Somewhere in that gap between 24 qubits held for a few microseconds and ~2,000 held for hours to do Shor on one Bitcoin key, the ceiling is lurking. But how do we know the ceiling is low? I think that is where you are going. Because we have reproduced in these labs, with cold and quiet and indirection through codes, etc., the coldest, quietest corners of the universe. But you can't get quieter than silence. You can't get colder than absolute zero. And that's where we find the ceiling. Because the physics doesn't change. All we can do is remove noise. But you can't isolate the system from itself. We are asymptotically approaching total isolation. Which means we are asymptotically approaching the ceiling. And it is nowhere near 2k qubits for hours. You can't show me evidence of anything that approaches that scale because it doesn't exist. We will get a really precise picture of where the ceiling is, and that is a cool scientific result. But we will not get a QC that can run Shor on a Bitcoin key. We have enough data to say with confidence that that is physically impossible. More every day. I am not surprised by all the prize winners and bag holders and dreamers who don't want to admit this openly. People are weak to temptation. But this is not Bitcoin's concern. Tick tock, next block.
JOE2o 2 months ago
>Somewhere in that gap between 24 qubits held for a few microseconds and ~2,000 held for hours to do Shor on one Bitcoin key, the ceiling is lurking.

Ok, so now we're getting somewhere. Right then: Atom held around 28 logical qubits and ran the Bernstein-Vazirani algo with demonstrable error correction, under a second. That was 2 years ago; a lot has happened since. And for Shor's we need ~2,000 and hours. I'd argue that for Shor's it's more likely around 1,700; we've already optimised it down by half, and there's more optimisation in the tank. Pure math, by the way: by offloading the most complicated steps (modular arithmetic and fraction conversion) to highly efficient classical methods, the quantum part gets streamlined. You're saying somewhere between 28 qubits for 1 second and (if optimisation holds) 1,700 for hours is some arbitrary limit of the universe. Atom will ship commercial 48-logical-qubit machines next year. Means them + Microsoft, machines at 48 logical, capable of running deep, sustained computations for minutes, hours, or potentially days in 2026. (This is all proven out in their lab; to prove me right is just a matter of waiting 6 months for them to ship. It is right though.) So 48, and running for hours. Where does it stop? At 96 and days? If not at 96 and days, then at 192 and weeks? If you can't say where the limit is, or even give a range, based on some physical properties, then you are basing the limit on nothing. You have to give a number to show you've got some physical basis for your impossibility statement.
You should be able to understand this from first principles and you are still squirming, but here is the equation: the Lindblad master equation (the equation for self-decoherence):

Γ_self ≈ γN²

Even with perfect isolation (γ → 10⁻⁸ s⁻¹), staying coherent for 1 hour requires N² × 10⁻⁸ × 3600 ≪ 1, which gives N ≲ 170 qubits max.
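Solving that condition for N with the poster's numbers (both the γ value and the γN² scaling itself are his assumptions, and are disputed downthread) reproduces the claimed ceiling:

```python
import math

# Assumptions taken from the post above (disputed later in the thread):
gamma = 1e-8     # assumed per-qubit self-decoherence rate in s^-1 ("perfect isolation")
t_hold = 3600.0  # required coherence time: one hour, in seconds

# Coherence condition: gamma * N**2 * t_hold << 1.
# Solving the equality for N gives the order-of-magnitude ceiling quoted.
n_max = math.sqrt(1 / (gamma * t_hold))
print(round(n_max))  # 167, i.e. the "N <~ 170 qubits" figure
```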
JOE2o 2 months ago
Lindblad? That's your card? Maybe for a NISQ machine without error correction, from back in the good old days. What Atom is doing (again, one of 60 or so paths being explored) keeps things far from the Rydberg state. Like, the entire purpose of QEC is to overcome the physical limits of T2 (coherence time). When you constantly measure and correct in a small group of physical qubits, the lifetime of the logical qubit gets exponentially extended, making the Lindblad N² decay irrelevant for the computational unit. I get the feeling this path you're on is all stuff coming out of the NISQ archives or something. I mean, the fact that Atom is *already* using 1,200+ physical qubits to build 28 logical qubits with a better-than-physical error rate is empirical proof that your Lindblad limit is obsolete in a computational context.
On Portuguese cigarette warnings they use the same word, Filho, which is normally also used, as in Pai, Filho, e Espírito Santo, to refer to children. Just go search the web. Portuguese is one of the closest languages to Latin still in use. That Bible translation should be read in the context of the idiom of the time and place of translation. Child of God. Same as all of us.
You just made a very substantial false statement. Lindblad is the gold-standard description of open quantum systems. It has never been falsified. Every attempt to scale past a few hundred physical qubits confirms it: N² correlated decoherence wins. QEC doesn’t repeal Lindblad. It adds more measurements and makes the N² term worse. Show me one paper where Lindblad is falsified or QEC removes the N² scaling. Either show me a paper that disproves Lindblad or admit you are wrong. No more dodging.
JOE2o 2 months ago
I have concluded you do not know what a logical qubit is.
Logical qubits are an engineering workaround to buy time by avoiding direct measurement. Price tag: more physical qubits + constant syndrome measurements = **more self-noise**. By physics, they **lower** the ceiling, not raise it. Lindblad is fundamental. Shor on a Bitcoin key needs ~2,000 qubits in one global wavefunction for hours. Now strip away **every** engineering problem. Give me the platonic ideal QC:
- absolute zero
- perfect vacuum
- zero cosmic rays
- silent measurement
- no logical qubits needed
Even then, **self-decoherence alone** (Γ ≈ γN²) caps you at ≤170 physical qubits, and coherence collapses in an hour. **SELF-decoherence.** You can’t isolate the system from itself. That’s not an engineering limit. That’s the universe saying “no.” Get it now? I'm not trying to roast you. The engineering is dazzling. But Lindblad's formula is the relationship between quantum stuff and classical stuff. All the skyscrapers of data we have track to that simple formula. And this is why I care and why I learned about this: we can't be mangling Bitcoin or scaring people away from ECC freedom tech over something that can't happen. They are way too important.
JOE2o 2 months ago
You are completely misunderstanding physical qubits <> logical qubits. If you understood what a logical qubit was, you would understand that QEC doesn’t “repeal” Lindblad. Also, repeal is a legal term; are you a lawyer? That would explain a lot. Lindblad itself is fine as far as math and physics go. It’s about limits imposed by environmental decoherence. Great. Key point though: if you have a way to pump coherence back into the logical qubit faster than the physical environment can drain it out, then this limit, which is again fine in itself, simply does not come into play. What Atom and many others have *already* done is empirical proof that the pump works, so to speak. You cannot say such a pump can never be built, because they exist today and are proven to work. And you cannot say the resulting logical (yes, logical) qubits can't perform the right kind of computation, because that's also proven. You've been proven out of an argument. The fact that you’re misunderstanding this as “falsifying” (or, er, “repealing”) the Lindblad limit, as opposed to simply removing the need to worry about hitting it, makes it pretty clear that you don’t understand what a logical qubit actually is. It’s like you’re saying there is a physical limit to how fast a human being can work an abacus. This is provably true; you can keep that one in your bag. But then you go on to claim that this “abacus limit” in turn limits the complexity of the mathematics that our species can do. Except, hello, calculators and computers. Dunno. It’s like debating OP_RETURN with someone only to find out at some point that they don’t actually understand what a UTXO is. How far can the debate really go?
You have not engaged in any debate whatsoever. I told you something you didn't want to hear. You either have QC stocks in your bag or you are just high on the sci-fi. Dodges, childish insults and quantum marketing babble do not constitute "debate". You can't "pump" coherence. That sounds like it came out of a badly written Star Trek. That's just some cringe QC marketing metaphor and not at all accurate. Logical qubits trade size for time against local noise, and only against local noise. Against self-decoherence (N²), they make it quadratically worse. It's not magic and it doesn't "pump" anything but quantum shill bags. Lindblad is the equation for ALL decoherence, including the N² self-interaction term that dominates at scale. Qubits decohere each other. You can't remove that, because that's how they participate in the same wave function. That's why there is a ceiling at ≤170. That's the whole enchilada. I'm tired of repeating myself. Are you going to say I work in a food truck now? I've explained the same thing to you 20 ways and you have had zero valid arguments. I've given you first principles and even the formula and you just dodge and blather. Now you've resorted to petty insults. This has become far too childish and I'm not going to give it more of my time. Good luck with your quantum bags and sci-fi fever dreams.
JOE2o's avatar
JOE2o 2 months ago
I'd pack my bags at this point too if I were you. Let me leave this for anyone who comes across this thread hoping for a neat conclusion. You have posted a Lindblad argument that assumes the noise scales quadratically. That quadratic scaling is the basis of your N < 170 qubits. Atom and Microsoft have demonstrated logical qubits operating at an error rate four times better than the physical error rate. Other teams around the world have demonstrated similar. These results are empirical proof that the noise in their systems is correctable, and therefore does not scale quadratically, and therefore the actual noise profile is not following the eventually catastrophic scaling path that defines your math (which ends up bing-bong at 170). It's one or the other: either Atom and Microsoft and Quantinuum and Harvard and everyone else are lying and it's all a scam, or your argument has already been proved wrong by experiment. Both of these things cannot be true at once. Passing reader: which one do you think is true?
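For the passing reader, here's the textbook shape of what the labs are reporting, sketched in a few lines. The threshold and prefactor below are illustrative values I picked for the demo, not lab numbers: below a threshold physical error rate, growing the code distance suppresses the logical error rate exponentially, the opposite of a quadratic blow-up.

```python
# Textbook surface-code scaling: p_logical ~ A * (p / p_th)^((d+1)/2).
# p_threshold and prefactor are illustrative constants, not measured values.
def logical_error_rate(p_phys: float, distance: int,
                       p_threshold: float = 0.01,
                       prefactor: float = 0.1) -> float:
    """Logical error shrinks exponentially in code distance d,
    provided p_phys sits below the threshold."""
    return prefactor * (p_phys / p_threshold) ** ((distance + 1) // 2)

p = 0.001  # a physical error rate below threshold
for d in (3, 5, 7):
    # with these toy constants, each step of d cuts the error by 10x
    print(d, logical_error_rate(p, d))
```

The point of the sketch: once you're under threshold, adding qubits buys you suppression, it doesn't pile on error.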
JOE2o's avatar
JOE2o 2 months ago
Accusing me of dodging as a dodge, how very novel. You have no answer, of course. You know you can't say they're all scamming.
JOE2o's avatar
JOE2o 2 months ago
Summary of the topic you cannot confront. You (quoting): “QEC doesn’t repeal Lindblad. It adds more measurements and makes the N² term worse.” Results of multiple real-world experiments: “Hi! We provide irrefutable empirical proof that QEC makes the N² term considerably better, across the board!” Classic case of real-world experiments forcing a theorist back to the drawing board. Drawing board’s over there.
You are falling for weasel words, Joe. That's how they weaponize your lack of expertise to get you to draw the wrong conclusions and keep you on the sauce. "Better N²" means they reduced noise (very locally) and plugged a smaller number into the equation. That doesn't change the equation. Again, we arrive at the max of 170 under the maximally generous assumption that they get those factors to zero. This means nothing but "we did better isolation". BTW, I was going to give you a consolation zap and I couldn't. Set up your wallet, broham.
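For anyone following along, here's the back-of-the-envelope version of the quadratic argument. The per-pair rate below is an illustrative number I'm choosing for the demo, not a measurement; the only point is the shape of the math: if every pair of qubits contributes a fixed mutual-decoherence error, total error grows as N(N-1)/2 and a hard ceiling falls out.

```python
# Back-of-the-envelope sketch of the quadratic-scaling claim.
# EPSILON is an illustrative per-pair error rate, chosen purely so the
# ceiling lands in the ballpark cited above; it is NOT measured data.
EPSILON = 7.0e-5

def total_pairwise_error(n: int, eps: float = EPSILON) -> float:
    """Accumulated error if every qubit pair contributes eps (~n^2 growth)."""
    return eps * n * (n - 1) / 2

# Largest N whose accumulated pairwise error stays below 1.
n = 1
while total_pairwise_error(n + 1) < 1.0:
    n += 1
print(n)  # -> 169 with this illustrative epsilon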
JOE2o's avatar
JOE2o 2 months ago
Nostr has too many features. I’m building a soft fork with no DMs, no zaps, no reactions, no media. Just text. You can zap some random nostr weirdo on my behalf. And this "better isolation + smaller number" attempt to retroactively add an asterisk to your earlier "QEC makes the N² term worse" can't fly, and I'm pretty sure you know it. Under your assumptions, the error rate of the logical qubit has to *always be worse* than that of the physical qubit, no matter how good the isolation is. But look, it's actually better. Also, the lifetime of the logical qubit would have to always be shorter (for Google's Willow it's about 3x longer). The results prove irrefutably that N² is *not* the governing factor in these QEC systems at all. To argue that the ceiling remains at 170 despite multiple results showing the logical error rate is better (yes, better) and the lifetimes longer (yes, longer) than the physical is to break your own math. It's correctable, exponential scaling, proven out by experiment. Not the uncorrectable, quadratic scaling your math depends on, and which you just broke with your asterisk anyway.
Can you please provide links to the best evidence available for the statements above? I'm a QC skeptic, but I'd change my mind if presented with evidence to the contrary.
Just ask ChatGPT what happens to the formalism of QM/QC IF time is quantized and discrete. Ask it how it can take the derivative of time if there is an invisible tick. Ask it what becomes of superposition and decoherence if time has an indivisible tick. Just understand the assumption that backs the threat model; then you can decide what you believe. You can’t logically believe in both Bitcoin and QC. You must choose one, as they require incompatible models of time.
It's also just patently false to call QM settled science at all. Non-locality was NOT resolved by the Bell test experiments. Superposition is still nonsense. Einstein and Schrödinger were not wrong. Bohr did not "win" the debate. There has been nothing but obscurantism and verbal tap-dancing around the problem. The Copenhagen interpretation declared victory, wrote nonsense into all the textbooks, and told everyone they just weren't smart enough to understand it, and that if they wanted a career they'd better "shut up and calculate". Sound familiar? That's how we ended up building on sand and ignoring the fact that the best minds physics has produced in hundreds of years plainly and simply showed that QM as we know it is a broken theory that needs to be replaced with something that actually solves non-locality (and not by resorting to "many worlds" hogwash). QM is not even close to being settled the way Newtonian physics and relativity are. It's a jumble of quasi-mystical jive maintained by social fear. And the PQ migration push is a motivated social attack on cryptography, based on an unfalsifiable, nonsensical threat, based on physics that is known to be broken. "Reality isn't real and things are nonlocal, which, ignore the contradictions and nonsense terms, is true despite nothing being real, and you can't ever measure it, but trust me bro, you're just not smart enough to understand it." This is fiat science. It's true because we say so, and you will get punished if you question it. Now shut up and calculate. And take your vax, pleb. You're not a virologist! No. Don't trust, verify.
JOE2o's avatar
JOE2o 1 month ago
What specifically? The fact that we are able to do genuine quantum computation with what we've got so far is VERY public knowledge.
JOE2o's avatar
JOE2o 1 month ago
You seem lost in a world of "maybe nothing is true at all". (It's not just you, so don't feel bad.) The results of the experiments speak for themselves. Either you posit that all the universities, labs, journals, etc., are faking results as part of some massive quantum FUD conspiracy, or you accept those results and reform your understanding of how the universe works around them.
JOE2o's avatar
JOE2o 1 month ago
You're the guy who says no core cryptographic element of any blockchain on the planet earth, including what are often referred to here on nostr as shitcoins, will ever be cracked by a quantum computer.
My friend, you are like a whole garden of logical fallacies. And your arguments keep boiling down to "lots of people believe this, therefore it must be true." It's not an argument, and it's lazy. And boring.
JOE2o's avatar
JOE2o 1 month ago
This coming from the "170 logical qubits is the absolute speed limit of the universe, here's my formula, and I'm the only one ever to have worked this out" guy. Spare me.
JOE2o's avatar
JOE2o 1 month ago
I get that you love bitcoin more than anything on earth and you don’t want mean old Mr. Quantum to hurt her, but for all your logic to emanate from this emotional (and kinda weird) part of your psyche does not make for enlightening debate. All your posts are just one long teenage love letter repackaged into science-mush.
JOE2o's avatar
JOE2o 1 month ago
Someone could code up a "coin wallet" right now that could easily be cracked with a quantum computer doing quantum computation. It’d be a super weak key (maybe 22-bit RSA or ECC or something), but it would be a demonstration of real quantum computation cracking a real ‘wallet-style’ key using real quantum effects, and provable. Any classical computer could crack that key too, of course, though not by making use of entanglement.
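To make the toy-wallet idea concrete, here's what cracking such a deliberately weak key looks like on the classical side. All the key values here are made-up toys I picked for the demo; a real wallet key is astronomically larger:

```python
from math import gcd

# A deliberately tiny RSA-style keypair (illustrative numbers only).
# Real keys use 2048+ bit moduli; this ~20-bit one falls instantly.
n = 826_277    # public modulus = p * q (secret primes 907 and 911)
e = 17         # public exponent

def crack_by_trial_division(modulus: int) -> tuple[int, int]:
    """Classical brute force: recover the secret prime factors."""
    f = 2
    while modulus % f:
        f += 1
    return f, modulus // f

p, q = crack_by_trial_division(n)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # recovered private exponent

msg = 42
cipher = pow(msg, e, n)          # the "wallet" encrypts
assert pow(cipher, d, n) == msg  # the attacker decrypts with the cracked key
print(p, q)  # -> 907 911
```

A quantum version of the same demo would attack the key via Shor instead of trial division, which is exactly why it would be a meaningful proof if anyone ever ran it.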
If time is quantized and discrete, they could not perform any QC. You still haven’t shown me your answer to the question. I’m not sure you can, so please ask an AI and paste the answer here to show you are engaging honestly with my question. You refuse to address it. “What happens to the formalism of QM/QC IF time is quantized and discrete? How can it take the derivative of time if there is an invisible tick? What becomes of superposition and decoherence if time has an indivisible tick? What becomes of QC if this is all true?”
JOE2o's avatar
JOE2o 1 month ago
What happens if you turn a cello inside out? Your questions don't mean anything. Think in terms of actual experiments. That's how science works. Experiments. And guess what, these experiments have all been done, you can review the results for yourself. These are answered questions, you're just ignoring the answers.
You are blatantly ignoring the question lol. Stop being dishonest, please. Ok, so please show me the experiment that empirically proves that time is infinitely divisible.
So you can’t show me the experiment that proves time is infinitely divisible? I just want you to admit you are being dishonest. You can’t point to the experiment that everything you claim MUST REQUIRE, yet you’re a man of experiments. So you are trusting an unproven axiom, and my only point is this: if that axiom is wrong, if time is quantized and discrete rather than infinitely divisible, every “advantage” you want to claim collapses, as does the mathematics supporting said theories. Yet you are unwilling to engage or admit this. Stop being dishonest; please address the question.
JOE2o's avatar
JOE2o 1 month ago
No, you're just being bonkers. It’s like we’re discussing whether Pepsi causes burps. I’m saying: “Lots of people have been witnessed drinking Pepsi and then burping; we can even test it ourselves.” You’re saying: “The letter P cannot be proven to exist, and that is the first letter in the word Pepsi, ergo there is no such thing as Pepsi, and so it’s impossible to burp from it.”
I’m simply asking you to provide empirical evidence that supports the assumption that time is infinitely divisible, and you won’t. So you are assuming that it’s true. But if time is quantized and discrete, everything you claim falls apart. The assumption about time sits beneath any physics or experiments. You refuse to provide evidence, you refuse to admit you are assuming something to be true that has NEVER been proven, and you won’t even discuss the outcome if that assumption were wrong. I guess it’s hard to be honest when someone asks you the right questions.
JOE2o's avatar
JOE2o 1 month ago
Look, your position is that quantum computing doesn’t exist. Yet in the real world we have quantum computers doing quantum computing. How are we supposed to have a discussion in light of that contradiction? Either we resolve that or there's nothing to say.
JOE2o's avatar
JOE2o 1 month ago
Send this grump your 170-qubit absolute speed limit of the quantum universe breakthrough research paper! He might invite you to New Zealand for some mutton and mint sauce.
JOE2o's avatar
JOE2o 1 month ago
Every field has grumps (this grump is actually genuinely funny, a well-liked grump you might say). Look at AI: there’s a clip of the grump LeCun listing off a bunch of things a 2-year-old can do but that LLMs will *never ever ever* be able to do because of the fundamental limits of what an LLM actually is. Spoiler alert: LLMs can now do every single one of those things, and LeCun has now been pushed out of Meta due to Mark Zuckerberg feeling kind of embarrassed about it all.
Can you provide a link that provides evidence that genuine quantum computing has happened? Not press releases written by marketing departments, but something like scientific papers or live demonstrations.
I read the paper. It only further reinforced my skepticism. I don't want to take the time to write a detailed critique, but I think the first peer reviewer's opinions are on point: https://static-content.springer.com/esm/art%3A10.1038%2Fs41586-025-09526-6/MediaObjects/41586_2025_9526_MOESM2_ESM.pdf The QC "calculations" are totally made up to justify the QC actually doing something beyond random noise. There is zero connection to any practical computation. I can't believe how many people fall for this shit. It reads like a parody of Eric Weinstein's Theory of Everything. However, the commitment to the bit is impressive.
JOE2o's avatar
JOE2o 1 month ago
Your position is somewhat unclear. Are you disputing the raw data or the interpretation of the raw data? And, more broadly, are you saying quantum computing itself does not exist? Or that quantum advantage does not exist? Or that both do exist, but there are fundamental limits to, say, how many logical qubits we can get to, or how many gates? Very few people I've talked to here actually have a clear position on this, so I think it's fair to ask.
> Are you disputing the raw data or the interpretation of the raw data?

As reviewer #1 stated, it's incomprehensible techno-jargon babble (paraphrasing). It's impossible to provide a rational and coherent critique of nonsense. The only possible critique is calling bullshit.

> Are you saying quantum computing itself does not exist? Or that quantum advantage does not exist?

They're doing some interesting physics experiments with no connection to any useful computation. It "exists", but it's nonsense. No QC "advancement" has plausibly shown quantum advantage or a path to utility (see the other reviewers, who agree).

> There are fundamental limits to, say, how many logical qubits we can get to, or how many gates?

Even if you buy into the bullshit completely, theoretically, many orders of magnitude more qubits are required to do anything useful. Every added qubit adds additional thermal load, noise, errors, and complexity. The capital investment required for a single useful QC would probably be higher than the current AI spend (assuming it's even possible, which is highly doubtful, IMO).

> Very few people I've talked to here actually have a clear position on this, so I think it's fair to ask.

I did a deep dive on QC years ago when they were trying to run Shor's algorithm. They've apparently given up on that since they failed to compute the prime factors of numbers as small as 15 and 35 (which any grade-school child can easily do in their head). There were also many ex-researchers and ex-QC engineers who effectively spilled the beans that it's total horseshit, some more politely than others. Here's just one example from a Cambridge PhD:

Now, as far as I can tell, they've moved on to things like OTOC and now OTOC^2, which they can't even explain without sounding like Eric Weinstein. It appears to be just looking at the behaviour of the qubits themselves vs. any I/O.
I've previously concluded that QC was BS, and therefore I'm not motivated to try to unravel this new phase of BS that is much further removed from legitimate mathematics and algorithms than the previous work. It appears to me that they're now working on "Weinstein" algorithms because it makes it much more difficult for people to call BS. People hear the super complicated techno-jargon and just assume it's over their head. Peter Thiel funded Eric Weinstein's work, so even very smart people can be fooled by this type of scam. I personally think it's a massive waste of time to "quantum proof" any software that relies on time-proven and well-reviewed encryption or trapdoor functions before a single QC project can plausibly show that they can run Shor's algorithm successfully on a small number such as 35. So, my ultimate advice is: when they can prove that they can factor small numbers with Shor's algorithm, then it's somewhat reasonable to have a concern about a future capability to factor much larger integers. Until that happens, it's a waste of time and resources. To go even further, the underlying GR/SR/QM theories are lacking and unproven, IMO, but that's a different, very complex conversation.
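For anyone who wants to check that bar for themselves, this is all Shor's algorithm actually has to deliver: the period of a^x mod N. Everything after that is classical bookkeeping. A classical simulation for 15 and 35, the very numbers they failed on, is a few lines (bases chosen here because they happen to give even periods):

```python
from math import gcd

def find_order(a: int, N: int) -> int:
    """Brute-force the period r of a^x mod N (the only step a
    quantum computer is supposed to accelerate)."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N: int, a: int) -> tuple[int, int]:
    """Classical post-processing of Shor: split N using the period of a."""
    r = find_order(a, N)
    assert r % 2 == 0, "odd period: pick a different base a"
    f = gcd(pow(a, r // 2) - 1, N)
    if f in (1, N):                  # unlucky base: try the +1 branch
        f = gcd(pow(a, r // 2) + 1, N)
    return f, N // f

print(shor_classical(15, 7))  # -> (3, 5)
print(shor_classical(35, 2))  # -> (7, 5)
```

That's the whole benchmark I'm holding them to: run the period-finding step on actual quantum hardware for numbers this small, and the concern becomes reasonable.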
JOE2o's avatar
JOE2o 1 month ago
I think there is some conflation going on here. Referee #1 is taking issue with the methodology used to demonstrate quantum advantage, not the technology itself. (The referee states being impressed with the technology itself.) I agree that demonstrating quantum advantage is messy, and the methodology here is convoluted. Essentially you're trying to prove you got a right answer in an environment where no other technology is able to check your answer, and that's never going to be clean. Reviewer 2: "extremely impressive experiment", etc. I think I read the word "impressed" 20 or 30 times in the reviews. Of course, on the fringes of every field you get the bullshit callers (I remember them very well in the world of machine learning 10 years ago; they are all very quiet now), but in general the bullshit callers are on the fringe here. I don't think there's any question in the field at large that 2025 was a breakthrough year, particularly with what Harvard and all demonstrated in December. You state you've concluded it's bullshit. That seems rather closed-minded, but fair enough. In that case the only thing to do is wait for the results of the experiments in 2026. If something happens that makes it pretty obvious that this is potentially a very real threat to encryption over the next 10 years, I'll jump back on this thread to see what you think.
Appreciate the earnest conversation, and this is a good place to conclude: we'll see. Yes, I think it's complete BS, including the fundamental principles it's built upon. In my opinion, it's just another money-pit boondoggle like CERN and the vast majority of quantum theory and mathematics (string theory, etc.). When people start talking about, and publishing paradigm-shattering papers solely based on, thought experiments like trains moving at light speed, bowling balls on trampolines, and dead cats in boxes, that's a signal to take a hard look at the math and evidence. I'd change my stance if presented with legitimate, verifiable evidence, but this latest paper certainly doesn't make the cut. If someone can't explain something in simple terms, then they don't understand it, or they're just bullshitting. Many people point to all of the PhDs and funding as evidence, but I see those as a motive to continue the con.
JOE2o's avatar
JOE2o 1 month ago
Fair take. My take is that fusion, quantum, gene editing, etc. are in the category of "all hype and bullshit--until they aren't". AI is in the same category, just further along. It was all hype and bullshit, I remember very clearly. And now it isn't. Sometimes such things can sit in "all hype and bullshit" status for decades and decades--until suddenly they aren't. Case in point: CRISPR. This sort of gene editing was "around the corner" for decades, and everyone got tired of the hype and all the bullshit--until suddenly one day we were actually around that corner and it wasn't bullshit anymore. I suspect the same will be true for fusion; my bet would be on the AI-controlled-plasma stellarator path. Everyone is just making best-guess predictions, any of which can be shattered the next morning by the results of a new experiment.