Bitcoin will survive post-quantum, but it will kill Nostr.
Can you explain, or provide more detail?
Cloudcoin will laugh at quantum, and nostr won't care.
Bitcoin will kill nostr?!
Nobody really knows what quantum is really about and if it's feasible at all.
At this stage it is nothing more than vacuous promises and shitcoin-style fear-mongering.
I suggest you look deeper than superficial FUD before making such claims.
You can just transfer your Bitcoin to the new wallet type they create. You can't transfer your Nostr identity to a new key.
Bitcoin won't survive if a successful Shor's attack happens before Bitcoin's quantum immune system has developed to a specific degree. Right now nobody can say whether the virus will hit before or after the immune system is at that point (if the virus will hit at all).
But one can make a very strong argument that it hitting before is within the realm of possibility, and that as a result that would mean the end of Bitcoin.
QC is pure FUD.
Quantum computers can't scale.
Because at scale, thermodynamics kicks in.
At scale, the cat is alive or it is dead.
nostr:nevent1qqs97qhj03szcjc6z25tszlg694a2wzmkaya5ecx2t2qg796hu9tu3szyrzrdrz39ecwxe2clgt8je7dw07g829fql4r3vlddq6clj7l4vx6vqcyqqqqqqgndsahx
Heard constantly about the level of AI we have now being "impossible" for several years. Guess what, unexpected breakthroughs happen.
A bitcoin-ending quantum attack is well within the realm of scientific possibility, that's the main thing. Nobody can say whether it will or won't happen, or if it does then when, but the fact that it's well within the realm of scientific possibility should be enough to make it the absolute #1 priority for Bitcoin, well over nonsense like OP_RETURN.
Yes, you see human nature at work though. We can be a petty species, and the operating system for Bitcoin at the end of the day is the human brain.
This conveniently leaves out three key facts.
1) Breakthroughs in math are just as dangerous as breakthroughs in hardware. We don't know the algorithms we don't know. Shor's is a very new discovery in math, a baby. It was there all along, we had no idea. Can we optimize it, reducing the number of logical qubits needed by a large degree? Likely we can. We can also use AI to devise the classical algos to weed down the input to Shor's without even needing to optimize Shor's itself, for considerable gain. Likely we'll do both at once. We also don't know if there are other quantum algorithms out there that enable more efficient factoring methods, as the space of quantum algorithms is woefully unexplored, and it's entirely possible there could be a paper published tomorrow that changes everything. (We don't even know if qubits themselves are the way to go or if there is something exponentially better.) Basically there's a lot we just don't know on the math side here.
2) The majority of research on the hardware side is military/weapons research, advances in which we will not hear about. China is potentially investing 5 to 10 times as much in quantum research as all known US private-sector investment combined. But it's mostly all very quiet.
3) We have no idea the impact of upcoming AI discoveries on error correction using messy qubits. AI solved protein folding, and that was years ago. We can make messy qubits at scale with the current hardware, and if it turns out that AI-supercharged error correction enables a much greater degree of messiness then that changes everything.
Basically the threat is real, and there's no sense sticking our heads in the sand, despite that being a thematically appropriate approach for nostr.
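For anyone wondering what's actually being optimized here: the only quantum part of Shor's is period-finding; the rest is classical number theory. A toy classical sketch (this runs in exponential time on ordinary hardware, which is the whole point — the hypothetical quantum machine would do only the `find_period` step fast; numbers here are tiny illustrations, not cryptographic sizes):

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Find the multiplicative order r of a mod n, i.e. smallest r with a^r = 1 (mod n).
    This brute-force loop is the step a quantum computer would replace."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int, a: int) -> tuple[int, int]:
    """Toy Shor: given a base a coprime to n, use the period of a mod n to split n."""
    assert gcd(a, n) == 1, "base must be coprime to n"
    r = find_period(a, n)
    if r % 2 != 0:
        raise ValueError("odd period, pick another base")
    f = gcd(pow(a, r // 2) - 1, n)
    if f in (1, n):
        raise ValueError("trivial factor, pick another base")
    return f, n // f

# 15 = 3 * 5; base 7 has period 4 mod 15
print(shor_factor(15, 7))  # → (3, 5)
```

The "weed down the input" and "optimize Shor's" ideas in the post above amount to shrinking how big that period-finding register has to be.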
what's the solution?
so we party like it’s 1999, babies!
🎉🥳🍾🪩👯♂️👯♀️💃🏻🕺
Not to mention that it takes an order of magnitude more in operational costs just to run what we currently have.
Shor’s algorithm depends on a fractional-reserve model of physics, treating unmeasured potentials as if they were real computational substrate.
It’s ironic that Bitcoiners reject fractional-reserve money as fundamentally unsound, yet they’re literally placing their trust in that same fractional-reserve illusion when it’s packaged as a sound model of computation.
You can’t call fractional reserves a scam in money and sacred in physics. Pick a side, you’re double spending your beliefs.
Use Bitcoin to falsify the fractional reserve ontology of physics quantum theory relies on to claim that there is a threat.
The scammers in Bitcoin have always been the ones who have tried to convince you it was broken to sell a fix. Trust the physicists my friend, Bitcoin and Nostr are cooked. Just don’t ask them for proof beyond a theory.
Trying to compare quantum states to fractional reserves certainly takes the cake. One is a physical law of the universe, the other is social policy. Also trying to imply that there is some sort of systemic default in the collapse of the wavefunction is ... my goodness me.
It's the conclusion of the computation, guaranteed by the mathematics of quantum mechanics.
Quantum computers work, we know this. Like it or not.
The entire ontology of quantum computing depends on a very specific claim: that a qubit literally exists in multiple states simultaneously prior to measurement. But this assumption is never physically defined. “State,” “exist,” and “simultaneous” are treated as if they were primitive, self-evident concepts. They are not. In physics, simultaneity has no meaning without a temporal reference frame and a smallest definable unit of irreversible change. If you cannot specify the tick of time against which “occurring at once” is measured, you cannot claim simultaneity at all. The smallest meaningful interval is the Planck time, yet Planck time has never been measured, operationalized, or instantiated in any experiment. Quantum computing therefore rests on an unverified assumption: that amplitudes evolved by the Schrödinger equation correspond to physically coexistent states, rather than a probability distribution over potential outcomes.
This matters because the wavefunction is a predictive tool, not a physical ontology. It gives you probability amplitudes evolving on a mathematically convenient continuous time parameter. It does not tell you what time itself is. It does not describe the physical mechanism of measurement. It does not define observation, does not provide a criterion for the boundary between potential and actuality, and does not resolve when or how collapse happens. Every interpretation: Copenhagen, Many Worlds, GRW, Bohm adds new metaphysics precisely because the wavefunction alone cannot specify reality. When you claim that collapse is “the conclusion of the computation guaranteed by the math,” you are smuggling in a metaphysical narrative that the math does not provide. Probability distributions are not proof of physical coexistence; they are only statements of uncertainty in the absence of measurement. Treating them as real, usable computational resources requires an ontology that has never been empirically validated.
The ambiguity is worsened by the reliance on continuous time. The Schrödinger equation presupposes a smooth temporal backdrop, but no experiment has ever verified the continuity of time. All physical measurements occur in discrete, irreversible events, thermodynamic transitions, atomic interactions, radiation absorption, or clock gating in quantum devices. If the universe evolves through discrete quanta of time, then “simultaneity” collapses conceptually: states cannot coexist “at once” if the universe only updates in discrete increments unless they occur at the same Planck Block of Time. Without a defined temporal substrate, the claim that quantum computers manipulate “many states at once” is not physics but an interpretative convenience. It’s equivalent to treating a prediction domain as a physical storage medium. This is the fractional-reserve ontology: unredeemed probability is treated as physically real capacity.
Bitcoin exposes this conceptual mistake by providing a concrete, empirical model of discrete measurement. In Bitcoin, unconfirmed transactions represent potential, eligible but not realized. They are unmeasured quantum states literally by formal definition of the word. Only when a miner expends real energy to commit a block does the system undergo an irreversible collapse of entropy into a definite state. The block is the discrete quantum of time within the system, the moment at which potential becomes actual, and memory is written. Nothing in Bitcoin is treated as real unless energy has been committed to make it real. This is precisely the physics quantum theory has not formalized: the relationship between energy expenditure, entropy reduction, time, and the emergence of definite outcomes. Bitcoin is the only system in existence today that performs this collapse in a controlled, thermodynamic way, producing an auditable sequence of discrete measurement events, something no quantum computing experiment has yet replicated or operationalized.
Quantum computing requires something Bitcoin categorically proves does not exist: scalable, physical simultaneity of unmeasured states. If you cannot define measurement, you cannot define coherence. If you cannot define time at the smallest scale, you cannot define simultaneity. And if unmeasured states are not physically real, they cannot serve as computational resources. Small-scale interference experiments do not demonstrate large-scale ontological validity; they only show that microscopic probability structures behave coherently when isolated under extreme conditions. They do not show that probability amplitudes represent physically existent parallel configurations. They do not show that coherence scales. They do not show that continuous time exists. They do not show that collapse is a computational resource rather than a thermodynamic one.
Bitcoin is not a metaphor here. It is the empirical counterexample: a working system where measurement is discrete, collapse is real, time is quantized relative to energy expenditure, and nothing is treated as “existing at once” without proof-of-work. If your quantum threat model depends on simultaneity you cannot define, time you cannot measure, and states that only exist as mathematical potentials, then the flaw is not in Bitcoin, it is in the ontology of the model you are defending. Bitcoin simply reveals it.
You’re double-spending your beliefs. You can’t logically support Bitcoin and centralized quantum computing at the same time because the physical ontologies they require are fundamentally contradictory.
Time to pick a side. Bitcoin, not quantum.
Wait until Bitcoin chooses a new type of keys, and then switch to that. Everything will be okay, but all existing Nostr identities will be nuked.
True, but the solutions are already on the table. Bitcoin could adapt in a matter of days if something happened.


tl;dr
Bitcoin, not quantum.
Bitcoin is an instantiation of physics that demonstrates the collapse of superposed states (a mempool - surface of potential configurations) into one singular measured block of time where each state is deterministic.
A quantum is literally just a portion of the whole, you cannot have a logical quantum without absolute mathematical finitude you can verify at each step of time.
Bitcoin disproves physics of the threat.
Wouldn't help.
If a well-prepared team (say in Asia) had a quantum machine capable of running Shor's to break sigs they could potentially take 30% or more of total Bitcoin supply within the first day. That would be a knockout effectively. They would have pre-cracked the private keys of tens of thousands of the richest vulnerable addresses, and likely a lot of the smaller ones, down to a pretty low bar. And any attempt to move bitcoin to a wallet without an exposed key while this mayhem is going on is open to active interception. Basically 30% is guaranteed, 50% is on the cards, and it would be over before it began, assuming the goal is to end Bitcoin and cause chaos (potential military goal), and not to slowly and quietly get rich.
That again assumes it happens before Bitcoin has any real immune system in place. Which hopefully isn't the case going into 2030.
Sounds like a Deepak Chopra speech, sorry. Doesn't make any sense
The only thing that doesn’t make sense to me is Bitcoiners who claim fractional-reserve bookkeeping can’t produce sound money, but fractional-reserve qubits (1 object, many states) can produce sound computation and sound physics.
Just observe Bitcoin if you want to understand QM at the Planck Scale. Bitcoin is pure logic of stepwise determinism constructed from superposed states.
You are thinking in analogies. I had a Bitcoiner friend who went insane and had to take a year off before resuming his work because he got too deep into the rabbit hole and started seeing connections where there were none. I'm not trying to insult you, but the only relation between a fractionally reserved balance in a ledger and the quantum state of some subatomic particle is some of the words used in the English language. They aren't related in any other meaningful way.
Just forget the words for a second and try to understand what you're saying; it doesn't make any sense.
I'm sorry, I see you are really into this, but this is unscientific wiffle-waffle. We have empirical evidence that quantum computing (advantage) is real, that the core principles (superposition, entanglement, interference) can be harnessed to perform calculations that defy the limits of classical computers. The question of whether a quantum machine can be fundamentally faster than a classical one on certain problems is settled in theory and demonstrated in principle. The questions remaining are just about scale.
Banger
but wut, I mean this is a philosophical argument that I'd accept as a valid refutation of quantum computing being a good monetary technology... so it doesn't defeat the idea of Bitcoin in any way, but it could very much defeat the technology of Bitcoin? Digital communication didn't defeat gold, it just allowed fractional reserve banking to break the technology of gold.
Bitcoin is not unscientific wiffle-waffle. It is the literal physical proof you are wrong. You just refuse to look at it for what it is. We can play ball tho:
Let’s take your ontology at face value.
If quantum computing is physically real in the strong sense you assert where unmeasured states exist and can be used as computational resources, then Bitcoin already possesses more “qubits” than any quantum computer on Earth. In your framework, a system does not need to define simultaneity relative to a temporal resolution, nor does it need to define measurement physically. Potential configurations count as existent configurations. By that logic, every unmined UTXO state in Bitcoin is a valid, physically real element of a quantum register.
Today there are roughly 166 million UTXOs. Under your ontology, that means Bitcoin has roughly 166 million qubits. Nothing stops us from broadcasting thousands of conflicting transactions for each UTXO, each one representing a distinct potential state. Since, in your view, potential (unmined transactions) = existence, all of these contradictory states “exist” simultaneously until measurement. If miners simply coordinate to avoid mining those transactions, the superposed set can be maintained indefinitely through time. There is no collapse until a block is mined, and since you do not define measurement physically, a mined block is just decoherence, not a finalization of reality. The ontology therefore allows indefinite coherence: we can maintain an arbitrarily large superposition simply by preventing confirmation.
If your definition of superposition is correct, we can run quantum algorithms, including Shor’s, on Bitcoin itself, right now. We can break Bitcoin’s cryptography with Bitcoin. We don’t need new hardware; we need only to preserve the superposition by keeping transactions unmined. We can treat UTXOs as the quantum register, conflicting transactions as amplitude components, and the mempool as the Hilbert space. According to your premises, this is not absurd but physically legitimate: unmeasured states are computational states. The ontology provides no criterion that distinguishes quantum amplitudes from unmined transactions. Both are uncollapsed potentials evolving in time. If we maintain their uncollapsed condition through social coordination, we have “coherence” for as long as we want. In your framework, nothing prohibits this or makes it physically meaningless.
But here is where the contradiction emerges. Bitcoin is an actually-instantiated physical system. It enforces conservation at every discrete step of time. A UTXO is not a spread of amplitudes; it is a deterministic, singular state until a new block is produced. Even if multiple conflicting transactions circulate, the base state does not multiply. It remains one unspent output. The system’s discrete temporal structure ensures that potential does not become parallel existence. It remains potential. When a block is mined, the system does not “decohere”, it collapses into one irreversible outcome, because every block is a physically paid thermodynamic event. There is no physical mechanism in the Bitcoin universe that corresponds to “quantum error correction” restoring potential after collapse. Collapse is final because energy is spent.
What this exposes is not a flaw in Bitcoin but a flaw in the ontology you are defending. In a discretized system whether Bitcoin or the physical universe, superposition cannot be continuous or ontologically real. It can only exist as a set of mutually exclusive future possibilities, not as parallel actualities. Every discrete quantum of time removes ambiguity and produces a single deterministic state. This applies even if a transaction sits in the mempool for hours: its underlying UTXO remains a singular, unspent state until a block instantiates a new one. The unmeasured future does not multiply reality; it merely defines possible successors. The structure remains deterministic at every discrete step.
When you apply your ontology consistently to Bitcoin, you reach an absurdity: you must treat contradictory ledger states as simultaneously real and reversible, even though Bitcoin’s physics forbids that. When you apply Bitcoin’s ontology to quantum mechanics, you reveal the error: superposition is not physical simultaneity; it is a description of unresolved potential awaiting a discrete, irreversible measurement.
This is the reconciliation you’ve been missing: Quantum superposition is discrete and potential, not continuous and existential. Every quantum of time produces a deterministic state even for the “unmeasured” by finite deduction.
Bitcoin instantiates it openly for anyone to observe. Any ontology that demands otherwise contradicts itself the moment it is applied to a real system with conservation, irreversibility, and discrete time.
You cannot prove the modern definition of superposition is true at the Planck scale of time; there is zero empirical evidence. You’ve trusted a narrative, and Bitcoin is the measurement and the proof you’re wrong.
History shows what happens when you underestimate Bitcoin. All of Physics is next.
You're lost man, you have to pull yourself back from the abyss. Take care of yourself, try and get a good sleep, have some tea, and day by day pull yourself back a little bit.
Why wouldn't quantum safe cryptography be brought to nostr? It would be far faster to adopt since there's no consensus to worry about
I’m grounded; Bitcoin is my proof. Your ontology is not, you’ve literally just trusted a narrative. Until you can define simultaneity at the Planck scale, the framework you’re invoking has no physical foundation. Without that definition, your interpretation of superposition remains an unverified assumption rather than established physics.
“And he said, Go, and say to this people: Hear indeed, but understand not; see indeed, but perceive not.” — Isaiah 6:9
I’m not the one defending the idea that Bitcoin is broken. Good luck with that.
Either quantum physicist or schizophrenic
Well that explains your desire to fuse bitcoin with spiritual transcendence.
Bitcoin is very earthly, I'm afraid.
The quantum computing advantage has been experimentally proven. If you want to learn how it's been experimentally proven, I suggest you do a bit of research, it's interesting stuff.
Or maybe you just haven’t thought about the meaning of these words, nor have observed Bitcoin for what it objectively is, or what value even means.
Maybe you are stuck in the “just money” definition. Who knows.
I’m not the one saying Bitcoin is broken by any means. Bitcoin is the threat.
No, Bitcoin openly falsifies your beliefs.
Show me another system in physics that openly measures the isomorphism of Boltzmann Entropy (heat/Kelvin) and Shannon Entropy (satoshis/utxo dataset) where the object is both the memory and the time of the relationship.
No physicist can define existence, quantized time (thus simultaneity), measurement, observer.
Bitcoin is the proof physicists could never produce.
Or Bitcoin is the technology that replaces physical theories with an open verifiable instantiation that anyone can literally use.
Why are you so convinced Bitcoin is broken?
Oh well. Nuke em!
Who said Bitcoin is broken? The technology risks becoming obsolete if we don't push forward post-quantum encryption, that is all. I think your post confuses the tech and the idea, or... readers confuse it for themselves and you're clear about the idea being unbreakable, though the tech may break if we do nothing.
“The technology risks becoming obsolete if we don't push forward PQC,”
You just did.
The proof you use to claim “the tech may break” is falsifiable.
It is far more probable that Bitcoin would be undermined by modifying it to counter a threat model built on physics that Bitcoin itself invalidates.
You really think we'll actually be able to make a computer that can break ECDSA or schnorr?
I think it's FUD. I don't think it's achievable.
why so absolute? just need to have an alternative ready and tested for when it is a threat... Bitcoin is an ideal and can remain perfect in that, and technology evolves so we upgrade to maintain the course towards the idea - simple as that to me
Capitalism at some point was undermined by lack of tech to enforce sound money, let's not lose to that again
Plus if you don't reuse addresses, the pubkey isn't revealed, and an attack would require SHA256 to be broken. I think that's impossible really. There's no evidence to suggest breaking ECDSA is even possible.
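To make the address-reuse point concrete, here's a minimal sketch. (Simplified: real Bitcoin P2PKH commits to RIPEMD160(SHA256(pubkey)) plus a version byte and checksum; plain SHA256 and a made-up pubkey are used here just to keep it portable.)

```python
import hashlib

def address_commitment(pubkey: bytes) -> bytes:
    """A hash-based address commits to a digest of the public key,
    not the key itself. Simplified stand-in for RIPEMD160(SHA256(pubkey))."""
    return hashlib.sha256(pubkey).digest()

# Placeholder 33-byte compressed pubkey, purely illustrative.
pubkey = bytes.fromhex("02" + "11" * 32)
addr = address_commitment(pubkey)

# Until the owner spends (revealing the pubkey in the unlocking script),
# an attacker only ever sees this hash. Shor's attacks the discrete log of a
# *known* public key; recovering the key from the hash would mean breaking
# the hash function itself, which Shor's does not do.
print(addr.hex())
```

That's why the exposure window matters: funds on never-spent-from, non-reused addresses only become Shor-targetable in the interval between broadcasting a spend and its confirmation.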
It's motivated reasoning. He sees whatever he wants to see and disregards anything that doesn't fit into whatever his original hypothesis was
nostr:nevent1qvzqqqqqqypzq0z029tpys6jfuc8a5hwyuk8ea98sfqyl036zanqvppmdt2z0mnhqy88wumn8ghj7mn0wvhxcmmv9uq3wamnwvaz7tmjv4kxz7fwwpexjmtpdshxuet59uqzpqjx46539l9mh4kvkrvexaqfd3txkjp3k5qs46aq6dd2ttrx00dfqvkmwg
Because Quantum FUD is more dangerous than QC itself.
Putting aside nostr:nprofile1qqsrcn632cfyx5j0xpld9m389370ffuzgp8muwshvcrqgwm26sn7uacpzemhxue69uhkzarvv9ejumn0wd68ytnvv9hxgqg4waehxw309ajkgetw9ehx7um5wghxcctwvse30xkv's argument, because it is deeper than mine, and he's probably onto something I don't fully grok...
QC can't scale, because at scale thermodynamics takes over.
At scale, the cat is alive or it is dead.
nostr:nevent1qqs97qhj03szcjc6z25tszlg694a2wzmkaya5ecx2t2qg796hu9tu3szyrzrdrz39ecwxe2clgt8je7dw07g829fql4r3vlddq6clj7l4vx6vqcyqqqqqqgndsahx
Yeah, but they CAN'T scale. Because when you scale, you get thermodynamics. I can't follow everything nostr:nprofile1qqsrcn632cfyx5j0xpld9m389370ffuzgp8muwshvcrqgwm26sn7uacpzemhxue69uhkzarvv9ejumn0wd68ytnvv9hxgqg4waehxw309ajkgetw9ehx7um5wghxcctwvse30xkv says, but what he is saying tracks with something that is obviously true. At scale the quantum cat is definitely dead. Engineers can do a lot of things, but they can't change physics.
I don't know where you're getting this, but thermodynamics and kitty-dynamics don't apply here. Schrödinger's cat is a thought experiment to illustrate the quantum/classic divide. His actual work is an equation that very accurately describes how the quantum state of a physical system changes over time, and that's what's really relevant here.
Quantum computers work, we know this. They work at the scale we have now (maybe 48 logical qubits, depending on who you trust), and they work just the same at the scale of 2,000 logical qubits, which is what's needed to crack Bitcoin private keys. (Unless we further optimize, in which case it'll be fewer.)
Scaling the sheer number of qubits does not imply "getting bigger" in the sense of lessening quantum effects. Basically increasing the complexity and number of these isolated quantum units is not allowing the system to become a large or "hot" classical object. The system remains fundamentally quantum.
They can.
nostr:nevent1qvzqqqqqqypzpg78lsd0mrjnpljpa54n6u36dkxg03yh8hp4zhaesz2cwetgyahqqy2hwumn8ghj7un9d3shjtnyv9kh2uewd9hj7qgwwaehxw309ahx7uewd3hkctcpzamhxue69uhhyetvv9ujuurjd9kkzmpwdejhgtcqyqm6yrke90m9axj5p3hhwchpxraxvfr76v4s0692q7n463xjwtqr5mnspqp
Please show me anywhere else in physics where we have isomorphism between a quantifiable amount of Boltzmann entropy (heat/kelvin) and Shannon entropy (satoshis/UTXO set) whose relationship is both memory and time.
To say Bitcoin is not physics is literally to avoid the obvious. We are literally watching the construction of time from first principle thermodynamics.
Nothing else in physics has demonstrated discretized time through thermodynamic change with entropy on both sides of the transformation quantifiable.
😂
Have you gotten yourself checked for Schizophrenia?
I guess the empirical evidence of Bitcoin that anyone can verify is meaningless.
It has no relationship to energy, time, entropy and information. It’s not physics, it’s just money! Silly me.
Even a chair is made out of electrons and shit. But if someone asks what it is made of, any sane person would say metal or wood etc. I'm being dead serious, you should get yourself checked or talk to your loved ones about it. The way you are thinking right now shows very clear signs of mental illness
😂 ok man.
“Bitcoin maximalist” literally rejecting the objective empirical proof of Bitcoin.
It’s more likely you’re just incapable of understanding what I am talking about, thus the reason why I resorted to a metaphor for the layman in the first place; the very point you are stuck on.
Schrödinger’s cat was a joke.
He was making fun of the idea of the exact thing quantum computing is trying to do: a macroscopic system that stays in perfect superposition while being constantly measured and entangled with the rest of the universe.
Nothing they are calling progress calls the ceiling into question.
Sorry but that makes no sense. Quantum computing works, the process is well understood, there is no debate about this in serious science.
Correct.
Well, if it's not something you can reason through yourself, just trust The Science, I guess.
Eye roll.
Well, at least you're being funny now.
In all seriousness, what makes you so sure that it works? Works in what way? I am genuinely interested in why people are so convinced by this. No meaningful results have been shown that challenge the existence of a hard ceiling on scaling. (IOW, nobody has ever proven Schrodinger wrong.) All the results are still consistent with a hard ceiling. What gives you a concrete impression that this is a solved problem and they just need to go a little further?
I know that is kind of the narrative of the hype, but have you ever TRIED to drill down on it?
Side question: do you believe AGI is coming too?
Because it's been proven by experiment multiple times, I dunno what to tell you. To the degree it's been proven that if you split atoms in a certain way you get bombs. As in, it's been proven to the degree that if you disregard the proof then nothing means anything and there's no discussion to be had, we're in "science is just my feelings" land.
Put another way, there is nothing fundamental in quantum physics itself that prevents large-scale quantum computers from existing, and we know this from experiment after experiment after experiment, observation after observation after observation. The same way we know that we can build a fusion reactor that works at scale, there is nothing in physics saying no, even though we've never built one before.
It’s an engineering challenge. When we talk about dealing with heat in quantum processing, it’s not in the sense of “bigness” or preventing the system from becoming “accidentally classical” as the number of qubits grows. It’s an old fashioned engineering challenge, basically refrigeration.
The industry is moving away from the big-chip model toward a modular, networked approach and different qubit technologies. So instead of one massive chip requiring thousands of control lines you get many smaller but highly connected QPUs, and this reduces the cooling challenge to something manageable.
Two entangled particles can be separated by the entire universe and still remain entangled. Modularity is not at all anti-quantum, nothing along the path to 2,000 qubits (or however many) is anti-quantum.
“Experiment after experiment” has proven exactly one thing:
Quantum mechanics works perfectly in microscopic, heroically isolated systems of ~100 physical qubits for microseconds.
It has never once been observed to survive continuous measurement and error correction at a macroscopic scale (needed for Shor).
Once the entangled system gets large enough, it crosses the threshold where thermodynamics forces decoherence and classical behavior — no matter how cold the fridge.
What we've been doing so far is just increasing the isolation of the system from the environment to access more of natural scale of quantum behavior. But we can't isolate the system from itself.
Fusion happens in stars.
Macroscopic quantum computation has never happened anywhere in the universe, ever.
That’s not an engineering gap.
That’s a physical prohibition.
What forces decoherence and classical behaviour is not an entangled system getting "large" in the sense that once you hit x+1 qubits, no matter how you cool or arrange them, it'll never work because you've run into some kind of quantum-physics speed limit.
Macroscopic quantum phenomena have been observed in systems like superconductivity and superfluidity, and that shows that collective quantum behaviour can be maintained at a large scale. Basically what we're talking about is a sort of thermodynamic robustness requirement that QEC is designed to meet, not a failure of quantum mechanics at scale. (Comes down to engineering again, and not any sort of natural limit like the mass density something can accrue before it collapses into a black hole.)
The solution is to break things into multiple smaller, independent quantum processing units. Each has a manageable number of qubits, say 50-100 (we've already done 50 logical qubits in all likelihood). These are easier to isolate and cool (and, bonus, subject to localized error correction).
Modules communicate with each other not by some kind of ET-phone-home deal, but by generating entanglement between a communication qubit in module A and one in module B. (And some other ways too.)
There's a lot more to it, and there are other avenues being explored that do away with qubits as we know them now altogether. Basically right now there are 50 or 60 different areas in any one of which an unexpected (though not at all implausible) breakthrough would change everything. That's kind of where we are. Like AI pre-AlphaGo.
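The "thermodynamic robustness" that QEC buys can be illustrated with its classical ancestor, the repetition code. This is only an analogy (real quantum codes like the surface code can't clone a state and use syndrome measurements instead), but the error-suppression arithmetic is the same idea:

```python
import random

def encode(bit: int, n: int = 3) -> list[int]:
    """Repetition code: store one logical bit in n physical copies."""
    return [bit] * n

def noisy(copies: list[int], p: float) -> list[int]:
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in copies]

def decode(copies: list[int]) -> int:
    """Majority vote recovers the logical bit if fewer than half flipped."""
    return int(sum(copies) > len(copies) / 2)

random.seed(0)
p = 0.05          # physical error rate per copy
trials = 10_000
logical_errors = sum(decode(noisy(encode(1), p)) != 1 for _ in range(trials))
# With three copies a logical error needs >= 2 flips, so the logical rate is
# roughly 3*p^2 ~= 0.0075, well below the physical rate p = 0.05.
print(logical_errors / trials)
```

The whole threshold-theorem debate above is about whether this suppression keeps working for quantum states once physical error rates are below a threshold, which is exactly what the modular, locally-corrected designs are trying to deliver.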
Modularity doesn’t matter.
Shor needs one giant, non-equilibrium, actively-corrected, constantly-measured quantum state — not a bunch of small ones stitched together. It's not MapReduce.
Superconductivity and superfluidity are passive, symmetric, equilibrium ground states.
They are not that kind of macroscopic quantum system.
Engineering doesn’t get to invent new physics.
Modular designs can absolutely get you a machine that can use Shor's to break a Bitcoin key. This is chip design 101 combined with quantum mechanics 101.
And this is only one of 60 or 70 separate angles in any one of which an unexpected (but totally plausible) breakthrough could change the game. Do you want to go through all 70? We may not even need qubits at all...
You have to be honest with yourself. You have no idea what will or won't happen. You have no crystal ball. Nobody does.
Google, Cloudflare, Signal, and many others have all moved to post-quantum cryptography. Yet somehow the Bitcoin community's only prevention work at the moment is pretending it can never happen. Classic head in the sand.
The smart thing is to take it very seriously and move as fast as possible to mitigate it.
Modular designs cannot run Shor on a Bitcoin key.
Shor is not MapReduce.
It requires one single, global, coherent, non-equilibrium quantum state across the entire register.
You have 70 engineering ideas to break the laws of physics? Good luck.
I don't need a crystal ball to know that the laws of the universe win every time.
And corporations grifting on the current thing? Never!
Physics doesn't care about compliance, projections, roadmap promises, laundered metrics, or 70 ideas for a breakthrough in the investor kit. It doesn't care who we give a Nobel Prize to, what China does, or how many billions get poured in.
Physics is not subject to fiat.
There is a hard physical ceiling.
If you want to suspend your disbelief and catch the fever, have fun.
But leave Bitcoin and ECC alone.
Honestly.
You're just plain old scientifically wrong; I don't know what to tell you. You have an internet-informed grasp of quantum physics, which is better than most people, so credit where due. But you clearly don't understand things at a deep level here, certainly not at the level of those working on these systems every day with their hands dirty. That's why the bulk of the (relevant) scientific community is saying something quite different from what you are saying, and why most Bitcoin people who are actively engaged here agree that it, like fusion, is a case of sooner or later.
Look, man.
If I’m wrong, explain — scientifically — how.
Two claims:
1. Shor needs one single, global, non-equilibrium, actively-corrected, constantly-measured quantum state at macroscopic scale.
That object has never been observed — not in nature, not in any lab, not once in 40 years.
2. Forty years of better isolation have already mapped the ceiling with brutal precision.
We’re hugging it, not raising it.
You keep answering with social proof and investor-deck talking points. That's not science. That's fiat thinking.
People need sound money and strong walls. Our freedom, privacy and security are under very real digital threat.
I'm not throwing down our best weapons on a rumor from the enemy.
Show me proof that a macroscopic, non-equilibrium, actively controlled, constantly measured, error-corrected universal quantum state is physically possible, and I'll eat my shoes on camera tomorrow.
Until then: don’t trust, verify.
Your move.
Let me clarify first: your view is that a quantum computer that can break a Bitcoin key is, no matter how it's engineered, a physical impossibility, because the laws of physics say it is? As in, this is not an engineering problem but a fundamental limit of physics, akin to faster-than-light travel in the conventional sense? That, with the knowledge we have today, we can conclude such a thing is totally impossible, now and for all time?
This is your view yes?
No more shifting goalposts.
You have a response or you don't.
If that is your position, there is nothing to respond to. You would be coming at this with a comic-book level of certainty about your own personal understanding of what the physical limits of the universe are.
If your position is that you agree such a machine may be possible but is, let's say, too difficult to construct in our lifetimes, then that's another thing. Then there's a debate. Then it makes sense to look at how it can or cannot be built.
You've outlined your position as the former, but if you want to clarify that it's in fact the latter then please do.
Hey, maybe you can help me out here with this question:
nostr:nevent1qqswmk4quck6vv9lzhpp6nk9skarls5krggvr7dmd6ws5y2llnjvfnqprdmhxue69uhhyetvv9ujuumwdae8gtnnda3kjctv8g6nw9sxqut
There are ideas out there; much hinges on discrete points and causal relationships. CST is one approach; hypergraphs are interesting and take that idea further, preserving locality. There is a bright young researcher named Jonathan Gorard who has a good interview here.
https://www.youtube.com/watch?v=ZUV9Tla43G0
You are deflecting away from the implications….If anything, CST and hypergraph models strengthen the point: they both quantize time. Once time is discrete, the entire ontology of quantum computation breaks. Discrete time means:
- no ∂ψ/∂t → Schrödinger's equation fails
- no continuous unitary evolution → Hamiltonians can't generate gates
- no coherence across ticks → superposition becomes impossible
- no substrate for phase evolution → Shor's algorithm cannot run
CST and hypergraphs don't rescue quantum computing, they expose the contradiction it depends on. Yet you stand with confidence that quantum computing (in its current form) is inevitable.
If time is quantized, the mathematical and physical foundations of QC disappear. Bitcoin simply demonstrates this discretization in practice, which is why invoking CST or hypergraphs only reinforces my argument.
Bitcoin is the working instantiation of what CST and hypergraph theories are still trying to formalize. The irony is that Satoshi solved the hard part in 2009 and almost no one has realized it.
You’ve completely deflected away from the argument because your stance is untenable. This isn’t about holiness, psychology, or theology, it’s about physics. Bitcoin is the only physical system we have that produces time from energy and entropy. It is the first empirical instantiation of discrete temporal evolution, not an article of belief. It literally is empirical proof.
Rejecting Bitcoin as physics is a category error and a serious one. You’re avoiding the question because the implications are unavoidable: if time is quantized as Bitcoin demonstrates openly in operation the entire formalism of quantum computation collapses. Your detour into theology doesn’t change that; it just shows you have no answer.
Don’t worry, the entire industry is rekt by Bitcoin, not just you.
There is a universe full of evidence that scaling this kind of system beyond a certain very low threshold makes it go classical. There is zero evidence that scaling it beyond that threshold without making it classical is possible. 40 years of QC research only confirms this. More and more heroic isolation only grinds closer to the ceiling and makes our knowledge of it more precise and certain. Failed attempts to falsify knowledge are supposed to make us more and more certain of it.
Believing in things with zero evidence to even suggest that they might exist is irrational.
All I have asked you for is one piece of scientific evidence that breaking this apparent law of nature, the ceiling, is even possible.
You have none, so you pivot. To speculative engineering ideas. To social proofs. To reframing my position as somehow "comic book" unreasonable.
You are advocating for massive, detrimental and dangerous changes to Bitcoin, based on the unsubstantiated dream of research scientists and investors who are profiting wildly from the hype and have zero results.
If you have evidence, put it on the table.
I'll wait.
Which route do you want to go? There are around 60, as I said.
Modular is perfectly legit. Or do you prefer Majorana?
Do you agree 10 physical Majorana qubits per logical qubit is reasonable? 50? Let's say 30.
Do you agree 1,700 logical qubits is enough to run Shor's, with some further optimisations? (There have already been many optimisations.)
Do you agree that makes ~50,000 physical Majorana qubits? (We can talk about arrangement later.)
If you want to dispute any of those points then go ahead.
Or if you want to suggest that what Microsoft is up to is a pure scam, go for it too.
If not then make clear you agree with all that and let's move on to the next part.
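For the record, the arithmetic in that message checks out under its own assumptions; here is a minimal sanity check, where both the 30-per-logical overhead and the 1,700-logical-qubit Shor estimate are the poster's figures (and are contested later in the thread), not established facts:

```python
# Resource arithmetic from the message above. Both inputs are the
# poster's assumptions, not settled numbers.
PHYSICAL_PER_LOGICAL = 30  # assumed physical Majorana qubits per logical qubit
LOGICAL_QUBITS = 1_700     # assumed logical qubits for an optimized Shor run

physical_total = PHYSICAL_PER_LOGICAL * LOGICAL_QUBITS
print(physical_total)  # 51000, i.e. the "~50,000 physical qubits" quoted
```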
😂 You're asking, "Why would the creator have a son and not a daughter?" as if that somehow discredits the physics. It doesn't. What it does reveal is that you've dropped the scientific argument and reached for a caricature of theology you don't actually understand, in an attempt to discredit me without engaging the argument.
First: you’re assuming biological categories apply to a creator of the universe. In Christian theology, “Son” is not a biological label, it’s a metaphysical relation, not chromosomes or reproduction. Treating it as literal mammalian biology isn’t a clever argument; it just shows you’re arguing against something you haven’t taken five minutes to understand. If your goal was to undermine my reasoning, all you’ve done is undermine your own credibility.
Second: you're confusing symbolism with mechanism. Every domain that encodes meaning (mathematics, logic, computer science) uses figurative language that no one mistakes for physical anatomy. We talk about "child processes," "parent directories," "orphan blocks," "sibling nodes." No one thinks computers are reproducing. You accept metaphor everywhere else, but suddenly pretend not to understand it here because it's a convenient way to dodge the actual argument in an attempt to discredit me.
So let's be clear: you have no rebuttal, only a deflection. It wasn't even a good one; you should try ad hominem next time 😂.
Instead of engaging the physics on the table, you reached for a theological strawman you don’t have the background to even parody correctly. All that does is highlight that you have no technical counterargument left. Rekt!
Those are engineering ideas about isolation, Joe. You're pivoting.
Just one piece of evidence that the ceiling can be moved.
That's the only starting point.
That is how we define the number.
I'm sure you agree the machines we have now work. I'm sure you agree we've got 24 logical qubits in the bag, proven.
So somewhere between 24 and X is where you believe the limit is.
If you believe it to be over 1,700, you no longer have a Shor's argument. If you believe it to be under 24, you also have no argument anymore. So where is it? 1000? 500? 1500?
Ok that's probably not fair and a bit mean.
So why bring up my beliefs in attempt to discredit the physics you can’t refute?
I’m not here to be mean, but I will defend myself. This is not about me, nor is it about you.
If time is quantized, the physics you claim is inevitable is falsified. I am stating that we have empirical proof in Bitcoin. Whether you agree is a different story.
I’m sorry but if quantizing time breaks the model that’s literally supposed to quantize everything, it’s obviously wrong.
So you believe that time exists and is not an illusion?
Yes, but it’s not belief. Bitcoin makes this empirically visible. The only reason continuous time feels intuitive is because time is the substrate you’re made of. You cannot perceive the gaps between the smallest discrete updates of reality, just as a program can’t perceive the CPU cycles that execute it. Every update to existence occurs in quantized steps, blocks of time (memory) because without discrete memory formation, nothing can change.
Light = information = memory.
No time → no light.
No light → no information.
No information → no memory.
Without memory, time has no meaning; memory is time.
We already measure time this way without noticing it. A second is nothing more than a human-scale unit defined as a frequency of Planck-time intervals. Planck time is literally expressed in seconds, meaning it already presupposes that time is quantized. A Planck interval is just the smallest tick (a discrete fraction of 1 second), and a second is the standard structured count (frequency) of those ticks.
I.e. 1 second is composed of ~1.855×10^43 Planck blocks.
All time is expressed as a frequency of this fundamental block of time.
Bitcoin exposes this structure. Each block is a discrete temporal update created by real energy expenditure. Once you see how an informational system behaves when time unfolds in quantized steps, the illusion of continuity disappears and so does the idea that physics can ignore the discreteness of time.
What we have been missing is the working instantiation to provide meaning to something we could not understand.
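For what it's worth, the tick count quoted above does check out arithmetically (the "blocks of time" interpretation is the poster's own claim, not established physics). A quick check, assuming the standard CODATA value of the Planck time, ~5.39×10⁻⁴⁴ s:

```python
# Number of Planck-time intervals in one second, using the CODATA
# Planck time. Only the arithmetic is checked here; the metaphysics
# in the message above is the poster's interpretation.
PLANCK_TIME_S = 5.391247e-44  # Planck time in seconds (CODATA)

ticks_per_second = 1.0 / PLANCK_TIME_S
print(f"{ticks_per_second:.3e}")  # ~1.855e+43, matching the figure quoted
```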
Pivot pivot pivot.
I'm not going to play the price is right with qbits.
Somewhere in that gap between 24 qbits held for a few microseconds and ~2000 held for hours to do shor on one Bitcoin key, the ceiling is lurking.
But how do we know the ceiling is low? I think that is where you are going.
Because we have reproduced in these labs - with cold and quiet and indirection through codes, etc. - the coldest quietest corners of the universe.
But you can't get quieter than silence. You can't get colder than absolute zero. And that's where we find the ceiling.
Because the physics doesn't change. All we can do is remove noise. But you can't isolate the system from itself.
We are asymptotically approaching total isolation. Which means we are asymptotically approaching the ceiling.
And it is nowhere near 2k qbits for hours.
You can't show me evidence of anything that approaches that scale because it doesn't exist.
We will get a really precise picture of where the ceiling is, and that is a cool scientific result. But we will not get a QC that can run shor on a Bitcoin key.
We have enough data to say with confidence that that is physically impossible. More every day.
I am not surprised by all the prize winners and bag holders and dreamers who don't want to admit this openly. People are weak to temptation. But this is not Bitcoin's concern.
Tick tock, next block.
>Somewhere in that gap between 24 qbits held for a few microseconds and ~2000 held for hours to do shor on one Bitcoin key, the ceiling is lurking.
Ok, so now we're getting somewhere. Right then: Atom held around 28 logical qubits and ran the Bernstein-Vazirani algorithm with demonstrable error correction, in under a second. That was two years ago; a lot has happened since.
And for Shor's we need ~2,000 qubits and hours. I'd argue that for Shor's it's more likely around 1,700; we've already optimised it down by half, and there's more optimisation in the tank. Pure math, by the way. By offloading the most complicated steps (modular arithmetic and fraction conversion) to highly efficient classical methods, the quantum part gets streamlined.
You're saying that somewhere between 28 qubits for 1 second and (if the optimisation holds) 1,700 for hours lies some arbitrary limit of the universe.
Atom will ship commercial 48-logical-qubit machines next year. That means them plus Microsoft, with machines at 48 logical qubits, capable of running deep, sustained computations for minutes, hours, or potentially days, in 2026. (This is all proven out in their lab; to prove me right just wait the 6 months for them to ship. It is right, though.)
So 48, and running for hours. Where does it stop? At 96 and days? If not at 96 and days then at 192 and weeks?
If you can't say where the limit is, or even give a range, based on some physical properties, then you are basing the limit on nothing.
You have to give a number to show you've got some physical basis for your impossible statement.
You should be able to understand this from first principles and you are still squirming, but here is the equation:
Lindblad master equation
(The equation for self-decoherence.)
Γ_self ≈ γ N²
Even with perfect isolation (γ → 10⁻⁸ s⁻¹), staying coherent for 1 hour requires N² × 10⁻⁸ × 3600 ≪ 1
→ N ≲ 170 qubits max
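Taking the message's own assumptions at face value (a collective decoherence rate Γ_self ≈ γN² with γ = 10⁻⁸ s⁻¹ and a one-hour coherence requirement), the claimed ceiling can be reproduced in a few lines; whether γN² is the right scaling for error-corrected systems is exactly what the rest of the thread disputes:

```python
import math

# Back-of-envelope bound from the message above, under its assumptions.
# Coherence condition: gamma * N**2 * t < 1  =>  N < sqrt(1 / (gamma * t))
gamma = 1e-8  # assumed decoherence rate under "perfect isolation" (s^-1)
t = 3600.0    # one hour, in seconds

n_max = math.sqrt(1.0 / (gamma * t))
print(round(n_max))  # 167, i.e. the "N ≲ 170" ceiling claimed above
```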
Lindblad? That's your card? Maybe for a NISQ machine without error correction, from back in the good old days. What Atom is doing (again, one of 60 or so paths being explored) keeps things far away from the Rydberg state. Like, the entire purpose of QEC is to overcome the physical limits of T2 (coherence time). When you constantly measure and correct within a small group of physical qubits, the lifetime of the logical qubit gets exponentially extended, making the Lindblad N² decay irrelevant for the computational unit.
I get the feeling this path you're on is all stuff coming out of the NISQ archives or something.
I mean, the fact that Atom is *already* using 1,200+ physical qubits to build 28 logical qubits with a better-than-physical error rate is empirical proof that your Lindblad limit is obsolete in a computational context.
On Portuguese cigarette warnings they use the same word, Filho, which is normally also used, as in Pai, Filho e Espírito Santo, to refer to children. Just go search the web.
Portuguese is one of the closest languages to Latin still in use.
That Bible translation should be read in the context of the idiom of the time and place of the translation.
Child of God.
Same as all of us.
You just made a very substantial false statement.
Lindblad is the gold-standard description of open quantum systems.
It has never been falsified.
Every attempt to scale past a few hundred physical qubits confirms it: N² correlated decoherence wins.
QEC doesn’t repeal Lindblad.
It adds more measurements and makes the N² term worse.
Show me one paper where Lindblad is falsified or QEC removes the N² scaling.
Either show me a paper that disproves Lindblad or admit you are wrong.
No more dodging.
Hello
tl;dr
JOE2O is correct
You are wrong
Show me the paper.
I have concluded you do not know what a logical qubit is.
Logical qubits are an engineering workaround to buy time by avoiding direct measurement.
Price tag: more physical qubits + constant syndrome measurements = **more self-noise**.
By physics, they **lower** the ceiling, not raise it.
Lindblad is fundamental.
Shor on a Bitcoin key needs ~2,000 qubits in one global wavefunction for hours.
Now strip away **every** engineering problem.
Give me the platonic ideal QC:
- absolute zero
- perfect vacuum
- zero cosmic rays
- silent measurement
- no logical qubits needed
Even then, **self-decoherence alone** (Γ ≈ γ N²) caps you at ≤170 physical qubits and coherence collapses in an hour.
**SELF-decoherence.**
You can’t isolate the system from itself.
That’s not an engineering limit.
That’s the universe saying “no.”
Get it now?
I'm not trying to roast you. The engineering is dazzling. But Lindblad's formula is the relationship between quantum stuff and classical stuff. All the skyscrapers of data we have track to that simple formula.
And this is why I care and why I learned about this: we can't be mangling Bitcoin or scaring people away from ECC freedom tech over something that can't happen. They are way too important.
You are completely misunderstanding physical qubits<>logical qubits.
If you understood what a logical qubit was, you would understand that QEC doesn't "repeal" Lindblad. Also, "repeal" is a legal term; are you a lawyer? That would explain a lot.
Lindblad itself is fine as far as math and physics goes. It’s about limits imposed by environmental decoherence. Great. Key point though: if you have a way to pump coherence back into the logical qubit faster than the physical environment can drain it out, then this limit, which is again fine in itself, simply does not come into play.
What Atom and many others have *already* done is empirical proof that the pump works, so to speak. You cannot say such a pump can never be built, because they exist today and are proven to work. And you cannot say that the resulting logical (yes, logical) qubits can't perform the right kind of computation, because that's also proven. You've been proven out of an argument.
The fact that you’re misunderstanding this as “falsifying” (or, er, “repealing”) the Lindblad limit, as opposed to simply removing the need to worry about hitting it, makes it pretty clear that you don’t understand what a logical qubit actually is.
It’s like you’re saying there is a physical limit to how fast a human being can work an abacus. This is provably true, you can keep that one in your bag. But then you go on to claim that this “abacus limit” in turn limits the complexity of the mathematics that our species can do. Except, hello calculators and computers.
Dunno. It's like debating OP_RETURN with someone, only to find out at some point that they don't actually understand what a UTXO is. How far can the debate really go?
You have not engaged in any debate whatsoever. I told you something you didn't want to hear. You either have QC stocks in your bag or you are just high on the scifi. Dodges, childish insults and quantum marketing babble do not constitute "debate".
You can't "pump" coherence. That sounds like it came out of a badly written Star Trek. It's just some cringe QC marketing metaphor and not at all accurate.
Logical qubits trade size for time against local noise, and only against local noise. Against self-decoherence (N²), they make it quadratically worse.
It's not magic, and it doesn't "pump" anything but quantum shill bags.
Lindblad is the equation for ALL decoherence, including the N² self-interaction term that dominates at scale. Qbits decohere each other. You can't remove that, because that's how they participate in the same wave function. That's why there is a ceiling at ≤170.
That's the whole enchilada. I'm tired of repeating myself.
Are you going to say I work in a food truck now?
I've explained the same thing to you 20 ways and you have had zero valid arguments. I've given you first principles and even the formula, and you just dodge and blather. Now you've resorted to petty insults. This has become far too childish and I'm not going to give it more of my time.
Good luck with your quantum bags and scifi fever dreams.
I'd pack my bags at this point too if I were you.
Let me leave this for anyone who comes across this thread, hoping for a neat conclusion.
You have posted a Lindblad argument that assumes the noise scales quadratically. That quadratic scaling is the basis of your N < 170 qubits.
Atom and Microsoft have demonstrated logical qubits operating at an error rate four times better than the physical error rate. Other teams around the world have demonstrated similar results. These results are empirical proof that the noise in their systems is correctable, and therefore does not scale quadratically, and therefore the actual noise profile is not following the eventually catastrophic scaling path that defines your math (the one that goes bing-bong at 170).
It's one or the other: either Atom and Microsoft and Quantinuum and Harvard and everyone else are lying and it's all a scam, or your argument has already been proved wrong by experiment.
Both these things cannot be true at once; that is impossible.
Passing reader: which one do you think is true?
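For a passing reader: the scaling the pro-QEC side leans on here is the standard below-threshold ansatz for surface codes, p_L ≈ A·(p/p_th)^((d+1)/2), under which the logical error rate falls exponentially with code distance d once the physical rate p is below the threshold p_th. A sketch with purely illustrative numbers (p, p_th, and A are assumptions, not measured values from any experiment named in this thread):

```python
def logical_error_rate(p: float, p_th: float, d: int, A: float = 0.1) -> float:
    """Below-threshold surface-code ansatz: p_L ~ A * (p/p_th)^((d+1)/2)."""
    return A * (p / p_th) ** ((d + 1) / 2)

p, p_th = 1e-3, 1e-2  # illustrative physical error rate and threshold
for d in (3, 5, 7):
    print(f"d={d}: logical error rate {logical_error_rate(p, p_th, d):.1e}")
# Each step up in distance multiplies the logical rate by p/p_th = 0.1 here,
# so for d >= 5 the logical rate falls below the physical rate: the
# "logical better than physical" behaviour the experiments report.
```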
Ending with one last dodge.
Pure quantum shitcoin.
Live long and prosper. 🖖
Dodge accusation as a dodge, how very novel. You have no answer of course. You know you can't say they're all scamming.
Smoking that hype isn't good for you, man. When you play at shitcoin bubbles, you're rugging or you're getting rugged.
Summary of the topic you cannot confront:
You (quoting)
“QEC doesn’t repeal Lindblad. It adds more measurements and makes the N² term worse.”
Results of multiple real-world experiments:
“Hi! We provide irrefutable empirical proof that QEC makes the N² term considerably better, across the board!”
Classic case of real-world experiments forcing the theorist back to the drawing board.
Drawing board’s over there.
You are falling for weasel words, Joe.
That's how they weaponize your lack of expertise to get you to draw the wrong conclusions and keep you on the sauce.
"Better N²" means they reduced noise (very locally) and plugged a smaller number into the equation. That doesn't change the equation.
Again, we arrive at the max of 170 under the maximally generous assumption that they get those factors to zero.
This means nothing but "we did a better isolation".
BTW, I was going to give you a consolation zap and I couldn't. Set up you wallet, broham.
Nostr has too many features. I'm building a soft fork with no DMs, no zaps, no reactions, no media. Just text. You can zap some random Nostr weirdo on my behalf.
And this better-isolation-plus-smaller-number move is an attempt to retroactively add an asterisk to your earlier "QEC makes the N² term worse"; pretty sure you know that can't fly.
Under your assumptions, the error rate of the logical qubit has to *always be worse* than that of the physical qubit, no matter how good the isolation is. But look: it's actually better. Also, the lifetime of the logical qubit would have to always be shorter (for Google's Willow it's about 3x longer).
The results prove irrefutably that the N² term is *not* the governing factor in these QEC systems at all.
To argue that the ceiling remains at 170 despite multiple results showing the logical error rate is better (yes better) and the lifetimes longer (yes longer) than the physical is to break your own math.
It's correctable, exponential scaling, proven out by experiment. Not the uncorrectable, quadratic scaling your math depends on, and that you just broke with your asterisk anyway.