Guys, even if everything in this paper is true, the material as it is might have limited applications.
From what they show, the critical field and critical current seem very low. 2500 Oe is about 0.25 tesla; even REBCO at 77 K is >1 T. And that 2500 Oe figure isn't even at the critical temperature, it's measured well below it. Skimming the article, I couldn't find the sample's cross-section for the current measurement, so there's no way to get the critical current density; the bare current (around 300 mA) is meaningless on its own.
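A quick back-of-the-envelope on those numbers. Note the cross-section below is a pure assumption for illustration, since the paper doesn't give sample dimensions:

```python
# Unit check on the figures above. The sample cross-section is NOT in
# the paper; the 1 mm x 1 mm value here is a hypothetical assumption.
OE_TO_TESLA = 1e-4            # 1 Oe corresponds to 1 G = 1e-4 T in vacuum

h_c_oe = 2500.0               # reported critical field, Oe
i_c_amps = 0.3                # reported critical current, ~300 mA

b_c = h_c_oe * OE_TO_TESLA
print(b_c)                    # 0.25 T

area_m2 = 1e-3 * 1e-3         # assumed 1 mm^2 cross-section
j_c = i_c_amps / area_m2
print(j_c / 1e4)              # 30 A/cm^2, vs roughly 1e6 A/cm^2 for REBCO films
```

Even if the assumed cross-section is off by an order of magnitude either way, the implied J_c is nowhere near magnet-grade.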
This means you can't actually push big current through this thing (yet). You can't make a powerful magnet, and you can't make viable power lines, both applications that were the hallmark of "room temperature superconductor revolution".
Of course, maybe one or a few more tweak(s) of the material and boom, it will give high J_c and B_c. I really hope it does, it would be super cool!
You can improve the current density a lot if you can make a single crystal, or at least make your crystal grains larger.
Impure superconductor samples often come out as a spongy mixture of superconducting and non-superconducting bits, so the critical current is limited because less than 25% of the cross section is actually carrying current. When I was DIYing YBCO this is what happened most of the time. Every now and then you would get a good one.
edit: Actually if you look at the sample picture in the paper on page 7, it looks like spongy crap. Nobel prize winning spongy crap, but still. I would expect the numbers to improve as better crystal growing methods are found.
Yeah, it’s been a pretty solid trend line: with a new superconductor, the first batch is usually kinda shitty but proves the basics, and refining the mix nails down its exact performance characteristics.
It’s spongy grainy crap indeed… but as long as their analysis holds up to scrutiny and replication… and whoa boy do I bet there are people already trying to replicate this result as I’m typing my reply and reading the rest of the comments…. As long as the results hold up and this isn’t an abnormally low performing superconductor… i have no doubt this is going to win a Nobel prize. This has been the prize for a long time in this whole discipline, and they may have finally nailed it.
We as a species may be on the brink of a revolutionary step forward in what we can achieve in engineering and science. Better instruments and more powerful or sophisticated motors and power systems. It’s heady stuff to think it may happen in my lifetime.
As someone who doesn't follow superconductors, what sorts of things about life/engineering/society would change, how dramatic would it be, and more importantly: which stocks would you pick :)
The direct impact could be relatively limited, at least for a while. Having a room-temperature superconductor is really awesome, but existing high-temperature superconductors are fragile and expensive. You can make motors, electromagnets, and power grids with much better efficiency—but that’s not so useful if the parts break when you look at them wrong.
You’ll still get better magnets and sensors, probably. Maybe even get new types of circuitry.
Just for comparison—we use silicon for integrated circuits. Not because it has the best performance, but because it’s convenient, it’s readily available, silicon dioxide is a good insulator, etc.
Sensors could be huge I think, even with a material that's a total dud in terms of the superconductor power revolution. I'd be surprised if there wasn't a range of novel sensor approaches that haven't really been explored due to the practicality threshold of low temperature superconductivity.
Isn't the big win of superconductors that you can build batteries with them? Like, you just pump them full of power that goes round and round forever with no or trivial losses. I always heard that this was why they were interesting.
It is an option, but there are two downsides:
- such a current generates a huge electromagnetic field. So it won't work for a car battery, but may work for grid storage.
- price - there is a limit to how much current you can store, and so far that has been the limiting factor. In other words, for this application we don't really care about room-temperature superconductivity; we care about the price of the materials needed to build such batteries
I'm pretty sure you can pick coil geometries that cancel external magnetic fields. There may be some stray fields, but they can be quite modest with tight manufacturing tolerances.
It's an interesting idea worth exploring. The two places where I think feasibility may face challenges are the energy density, gated by critical current density and magnetic field, and the raw discharge rate (giant inductors are not known for being able to change their current quickly).
Knowing peak capacity and aging is also tricky since you can't measure critical limits without hitting a quench (a very, very bad scenario). You'll need to maintain healthy margins so you don't have things blowing up on sunny days or after so many charge/discharge cycles.
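To put numbers on that energy-density gating: the field energy density of a magnet is u = B²/(2μ₀), so a low critical field hurts quadratically. A rough sketch, where the 20 T comparison figure is my assumption based on present-day cryogenic magnets, not anything from the paper:

```python
import math

MU0 = 4 * math.pi * 1e-7     # vacuum permeability, T*m/A

def magnetic_energy_density(b_tesla):
    """Energy stored per cubic metre of field volume: u = B^2 / (2*mu0)."""
    return b_tesla ** 2 / (2 * MU0)

u_low = magnetic_energy_density(0.25)    # at this material's ~0.25 T: ~25 kJ/m^3
u_high = magnetic_energy_density(20.0)   # assumed cryo-magnet field: ~160 MJ/m^3
print(u_low, u_high)
```

Since u scales as B², raising the critical field 80x buys a 6400x gain in storable energy per unit volume, which is why the critical field number matters so much for storage.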
Back in the 90s or maybe early 2000s, everyone was convinced that silicon was almost dead for high-performance chips like CPUs, and that we'd all be switching to GaAs (gallium arsenide) very soon. Turns out that GaAs wasn't that practical and silicon's limitations could be overcome, so we still use silicon today.
One thing I can imagine is desktop MRI machines, or, at least, much cheaper and less finicky big MRI machines that don’t take days to chill to operating temperature.
Maglevs are also a popular guess - safer, faster, and more power efficient ground mass transport would be a huge thing.
Maybe even magnetic rail space launches.
And, of course, military applications (the last few examples I mentioned involve accelerating big-ish masses to surreally high velocities, which is a popular approach to weaponry).
Those are the two I see mentioned most often (MRI and maglevs). To be honest, those are great things to improve (especially the MRI); however, the amount of hype in this thread and on the internet tells me there must be more to it than improved MRI and better trains.
I think a big part of the reason there's so much hype around this is that room-temperature superconductors have long been a famous, almost legendary undiscovered material in popsci. Theoretically possible, but with no proof that any such material actually exists. But now this paper is not only claiming that such a material exists, but also that they made some, that it's easy to manufacture, and that it works at temperatures well beyond ambient. Seems almost too good to be true!
So in addition to any immediate practical applications there's also this element of cracking a famous long unsolved problem. It'd be like if we discovered definitive proof that P != NP, or a theoretical basis for FTL communications. Even with no immediate practical applications it'd still be huge news.
> Theoretically possible, but with no proof that any such material actually exists.
I wouldn't say that it was necessarily "theoretically possible," for there has never been, and there still isn't, a grand theory of how any given material's atomic/crystalline structure relates to superconductivity. In other words, with no theory of material superconductivity, it was never quite clear what's possible and what isn't. With this new material, though, we might get a lot closer to a working model, if nothing else.
I got into superconductors shortly after the Fukushima disaster, when transmission lines were the poster-child application for superconductors. One of the claimed perks back then was that you could site reactors that are sensitive to earthquakes and the like out where they are relatively safe from such things. You could also have fewer, higher-output power stations, so fewer municipalities would have to operate small power plants (which tend to be coal or natural gas).
If it's economical it will help with transmission losses in the grid over distance. Colliders like the LHC and the ones at Fermilab could be a lot cheaper to upgrade with room-temperature superconductors, and more reliable too. It would also be a big help for fusion reactors. Anything that uses electric motors will benefit as well.
The changes introduced by discovery of fire were very gradual. No Internet, no written language at all: any innovation back then could only spread as quickly as humans walked, and would temporarily stop at every significant geographic barrier such as sea.
Compared to the discovery of fire, the changes from room temperature superconductors would be a flash fire.
I totally missed the impact of the internet, you are absolutely correct. Something else that I just noticed buried in the paper is that it can be readily deposited onto a substrate (search for UNIVAC in the paper, page 4). That's an incredibly big deal.
For the idiots in the room (me), could you explain why that’s a big deal? Does that make the material more robust and therefore usable in harsher environments?
We can deposit it on chips; there is lots of copper (relatively speaking) in a chip, so this could improve efficiency. Of course the semiconductor (silicon) is intentionally resistive, so waste heat wouldn't go away entirely.
Note that when we invented the aeroplane, uses weren't immediately obvious.
People would have suggested things like cities in the sky, looking down on things, and traversing marshland easily.
The actual main use for planes has turned out to be fast long distance travel. But we don't actually theoretically need to be up in the air to travel fast or far - in fact, had we never invented the aeroplane, we'd probably have cars or trains by now that moved as fast as present day planes do.
The revolutionary effect of new inventions is often hard to see, particularly for basic science research like superconductors.
Can you? The proposed model implies this relies on non-periodic impurities to achieve superconductivity, and the measured strength is consistent with superconductivity appearing only in some places spread through the material. This seems qualitatively different from YBCO.
I don't know enough to judge how certain that model is, but it seems well supported. So I'd be surprised if improving the material quality alone were enough to make it strong.
You are probably right about the details, but as a general observation: industrial products tend to get better over time as people learn how to make them properly. The first batches coming out of a lab are usually terrible compared to what comes out of factories a few decades down the line.
They've just proven (if true of course) that it's possible at all. That is a massive, massive leap.
And once it's possible, it won't be long until it's optimized. We've seen this everywhere: transistors were once huge and are now nanometers across; solar cells have improved in every way; batteries are cheaper and better than ever.
I was with you on the first part. If this proves room-temperature/ambient-pressure is possible at all, that is huge.
Not so sure about the "won't be long until it's optimized," though. There are a lot of examples where something seems perpetually 20 years away. I'd advise tempering the transistor-based optimism with just a skosh of fusion energy skepticism.
Compared to fusion it should be a lot simpler, and if there is one material that exhibits these properties there might be more. I'm really hoping this isn't a scam and that there isn't some kind of critical error. Regardless of how much more work needs to be done to get this to commercial viability at scale, if it's true I would imagine that massive investment will start chasing that goal.
Just thinking about the possible applications for storage makes me dizzy. Fingers crossed.
>There are a lot of examples where something seems perpetually 20 years away. I'd advise tempering the transistor-based optimism with just a skosh of fusion energy skepticism.
most of the common examples are in-the-works or exist in some form, they just don't satisfy the 'every-person' checkbox yet.
AI? sure. Flying cars? sure. Robots? sure.
Fusion is in the works, too. Tens of billions of dollars being thrown into the ring by private capital -- and recently -- which is a pretty good indicator of 'perceived realistic' historically.
Also, it's kind of apples/oranges. We had equivalent mechanisms before the transistor; transistors just led to the extreme miniaturization of logic gates that we now enjoy. Fusion energy production doesn't (really) have that equivalent.
Similarly: room-temperature, atmospheric-pressure superconductors are a new thing if proven possible.
My favorite example is the hydrogen fuel cell. It was invented before the lead acid battery back in the 1800s. PEM gave it a boost around the Apollo era, but that wasn't enough to make it widespread either. Lead-Acid went through its whole 100+ year character arc and hydrogen fuel cells still haven't found product market fit. Sometimes that's how it is.
There are plenty of commercial applications of hydrogen fuel cells though. The biggest issue has been pretty much a constant over the time since it has been invented: keeping the membranes free from impurities is hard.
But there are all kinds of transportation devices using hydrogen in production today.
Sure, but the point is that they hardly caused a revolution in energy storage. If fuel cells didn't exist, for the average person life would be exactly the same.
Revolutionary tech does change normal people's lives, and sometimes very rapidly, but a lot of stuff that looks revolutionary just kind of never works out.
I'd say nuclear power is probably the prime example of this. It was supposed to bring us electricity too cheap to meter, but it's actually the most expensive form of generation that anyone bothers to build.
> It was supposed to bring us electricity too cheap to meter, but it's actually the most expensive form of generation that anyone bothers to build.
Hardly, coal is much more expensive if you price in the externalities. We just pretend they don't exist for coal, and we staple anything that moves to nuclear.
Coal kills 25 people per TWh generated, and the actuarial cost of a death is about $10M as used by the nuclear industry, not to mention the environmental costs. That means the all-in cost of coal is much much higher than the LCOE - about 16c/kWh according to the Government of Canada. [1] The 2019 US EIA LCOE for nuclear is about 7.7c/kWh. [2]
Nuclear is cheaper than coal and costs around the same as solar/wind + storage - less, depending on the desired level of equivalence between the two. It has high up-front capital costs and a long payback period so the cost depends primarily on cost of capital. Fuel costs are $0.015/kWh to $0.00015/kWh in uranium.
There are places that get lots of cheap, reliable no-carbon power from nuclear. For instance Ontario, at about $0.10CAD/kWh ($0.075USD/kWh), delivered. [3]
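For what it's worth, the mortality arithmetic behind that comparison checks out, using only the figures quoted above (not independently sourced here):

```python
# Mortality cost of coal per kWh, from the figures quoted above.
deaths_per_twh = 25.0        # quoted coal death rate, deaths per TWh
usd_per_death = 10e6         # quoted actuarial cost of a death, USD
kwh_per_twh = 1e9

cost_per_kwh = deaths_per_twh * usd_per_death / kwh_per_twh
print(cost_per_kwh)          # 0.25 -> $0.25/kWh in mortality cost alone
```

That mortality term by itself is several times the 7.7c/kWh nuclear LCOE, before counting any environmental externalities.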
Can you give me an example of a nuclear power station that was built entirely without taxpayer subsidy by a private company, and manages to sell electricity to the grid cheaply and make a profit?
> Can you give me an example of a nuclear power station that was built entirely without taxpayer subsidy by a private company, and manages to sell electricity to the grid cheaply and make a profit?
Why is that a goal? Zero-carbon electricity is a goal, so that we don't all sink. Profit isn't. If ever there were a job for taxpayer dollars, IMO, this is it.
"Yes the planet got destroyed. But for a beautiful moment in time we created a lot of value for shareholders."
Not at all. I quoted the US EIA in that 7.7c/kWh figure on the pre-subsidy LCOE of plants coming online in 2023. It was listed in my [2].
"In 2019 the US EIA revised the levelized cost of electricity from new advanced nuclear power plants going online in 2023 to be $0.0775/kWh before government subsidies, using a regulated industry 4.3% cost of capital (WACC - pre-tax 6.6%) over a 30-year cost recovery period" [old 2, sourced from new 1]
The response to you was a separate opinion on the role of government in the energy sector. After all, we've put trillions of subsidies into fossil fuels; the least we could do is put money into something that solves problems instead of creating new ones. I also said I don't care whether it's public or private funds that are used to construct it.
Really nothing at that scale is built without subsidies, but the 7.7c/kWh rate was before subsidies.
Did I miss something? I thought I answered your question with that figure and its origin, it sounds like exactly what you wanted to know. I don't know which precise power plants they included but it's probably in [1].
[edit] I guess it's weird that we're so stuck on this one specific technology. What's the un-subsidized cost of oil power when the price of oil is set by OPEC, externalities aren't factored in, and every time it goes up we unload the Strategic Oil Reserve? How do you price in drilling in ANWR?
Coal, I mean, it's all unpriced externalities - death, environmental toll.
Solar? 90% of the panels come from China; how much does the PRC subsidize the plants and materials that go into making them for its own geopolitical goals? Rare earths for wind? They all come from China too. Lithium for storage?
My question is sort of more "can anyone name any utility scale power project of any type that was built by a private entity without government subsidies of any sort? Why would they do that if they didn't have to? What does that even mean? And really, does that even matter?"
The reason is pretty obvious and doesn't require any kind of conspiracy.
Nuclear can be done safely; we know this because every single time there has been an accident, it's because an operator did something wrong.
The problem is that nobody has yet designed a reactor that a sufficiently amoral operator could not make unsafe. Even if you have completely automatic and passive safety features, a bad operator could disable them if a false positive happens even once and costs them money.
For this reason nuclear has a LOT of regulation and red tape. It has far more than any other kind of energy, because even though the risk of an accident is low, the outcome of an accident is worse than for any other kind of energy except hydro. Hydro has fewer failure modes, and they are cheaper to check, so regulations there tend to not be as expensive.
Molten salt reactors manage to be expensive for perfectly normal engineering reasons though. The salt is highly corrosive, meaning you need tons and tons of extremely expensive piping that can withstand extremely high temperatures and extremely corrosive environments.
There's a reason none of _those_ have been built either. No amount of red tape would make them unviably expensive if the end product was cheap enough to run, but it isn't.
Also I should mention, part of what makes them passively safe is there is a plug at the bottom that melts if the reactor overheats. This is easy to bypass, put something over the plug that won't melt.
> There's a reason none of _those_ have been built either.
Let me introduce you to the Molten-Salt Reactor Experiment[1].
What you probably meant is that none have been built commercially. That is true, but again, as I mentioned, not because of their technical drawbacks but because of politics. In fact, the inventor of the light water reactor, Alvin Weinberg[2], was a strong proponent of the molten salt reactor over his own invention. So strong that he was fired from ORNL because he was claiming that light water reactors are inherently unsafe and that the MSR is a better design.
Nixon ultimately sacked him because he (Nixon) chose to support LMFBR (Liquid Metal Fast Breeder Reactor) because it was being built in California, and in return he got political support that he needed. MSR ultimately lost due to pork-barrelling.
> This is easy to bypass, put something over the plug that won't melt.
I mean you're shifting goalposts here. The "operator" has a specific meaning - someone controlling the reactor from the control room. They don't have access to the freeze plug during normal reactor operation.
But even if they did do what you're suggesting, the pressures inside an MSR are so low (on the order of a couple of bars) that the damage would be quite limited.
We had a working reactor in the 1960s, but we chose not to develop it commercially due to our inability to choose rationally. That's why we are where we are now.
It's part of a larger trend of eroding competence at civic infrastructure scale construction, but also specific to nuclear we've found repeatedly that construction and decommissioning costs and schedules were wildly optimistic.
Plutonium-239 (just an example) has a halflife of 24,000 years, ground water is a thing and it moves a lot. Modern concrete is both water permeable and has a tendency to degrade aggressively in wet environments over alarmingly short periods of time. Try again?
Fuel cells are just another energy converter, like the dynamo or any kind of motor. They turn one kind of energy into another, and in the context of hydrogen (which you could produce out of water using electricity, as a means of storing energy) they serve to reverse the storage step. This is nice to have, but just like nuclear power it's a variation on stuff that we already have; it is at best a quantitative change (and hopefully an improvement).
Room-temperature, ambient-pressure superconductivity is something we do not currently have. The difference between having it and not having it is a qualitative difference, and hence it will enable a whole raft of applications for which we currently do not have a solution.
> By that logic a super conductor is just a variation of a conductor.
No it is not. The difference between 0.1 and 0.0 can't be expressed in orders of magnitude.
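The qualitative difference shows up directly in how a current-carrying loop behaves: in an R-L loop the current decays as I(t) = I0·exp(-Rt/L), so any nonzero resistance, however tiny, eventually kills the current, while exactly zero keeps it forever. A minimal sketch with illustrative numbers:

```python
import math

def current_after(i0_amps, r_ohms, l_henries, t_seconds):
    """Current in an R-L loop decays as I(t) = I0 * exp(-R*t/L)."""
    return i0_amps * math.exp(-r_ohms * t_seconds / l_henries)

I0, L = 100.0, 1.0           # 100 A circulating in a 1 H loop (illustrative)
YEAR = 3.15e7                # seconds in a year, roughly

# Even one microohm of residual resistance drains the loop within a year:
print(current_after(I0, 1e-6, L, YEAR))   # ~2e-12 A, effectively gone

# With R exactly zero, the current persists indefinitely:
print(current_after(I0, 0.0, L, YEAR))    # 100.0
```

No amount of incremental improvement in copper gets you from "decays" to "persists"; that is why it is a difference in kind, not degree.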
> Nuclear power is more distinct from burning coal than any superconductor is distinct from copper wire.
Nuclear power is an incredible invention. Unfortunately it has some problems that won't go away by wishing it to be so, and there are many similarities with coal (as well as some obvious differences).
But this thread isn't about coal vs nuclear.
Frankly, if you don't see the difference between the relative importance of superconductors vs copper wire and nuclear vs coal, then I really don't think I have anything to say that will interest you. Suffice to say that nuclear didn't change the world all that much (except in a weapons sense), but superconductors at room temperature and ambient pressure have the potential to change the world in ways that are hard to even imagine.

Even if true, I still don't think it would be in time to help us address some of the more urgent problems we are facing. Neither does nuclear. And come to think of it: if this tech is real (big if), it will probably cause a revolution in nuclear as well, because it would allow nuclear power to be transmitted the world over without the non-proliferation headaches associated with shipping reactors to various countries. It wouldn't solve the waste problem (though there are some interesting reactor designs now), and it won't happen overnight, but it would make a difference.
> superconductors ... have the potential to change the world in ways that would be hard to even imagine.
Such as? I do not see how superconductors help with the challenges of Climate Change, food insecurity or danger of nuclear war. What will be the change for the average Joe? Maybe a better electric car?
Energy storage and distribution alone could affect all three of your examples, and those are the obvious ones. Then there is medicine, possibly a better shot at fusion, and so on. Better electric cars are at the bottom of my wish list (because they're still cars). But yes, those too (much lighter but more powerful motors mean you could do away with the drive train completely, and it would also let you get rid of the brakes), assuming the superconductivity can be maintained in strong magnetic fields (not a given).
Okay, could you be more specific? What kind of improvement can we expect in storage? Is the improvement measurable, i.e. will I be able to buy more kilowatt-hours of storage per dollar?
You'll be able to buy more kWh of storage per dollar (though likely not initially), and it will be relatively dense compared to current - pun intended - options, and very likely good for a much higher number of cycles (because there is no chemical cycle, just electron movement). The 'if only we had a room temperature superconductor' list of inventions that got temporarily shelved is longer than my arm; the words 'game changer' were never more applicable. If it is replicated. If it can be manufactured competitively. If it (or a variation) works at higher current densities.
Nuclear power and coal power are both heat engines turning heat into mechanical energy and subsequently electricity. They merely use a different heat source.
They share a lot of qualities because of that. They are relatively centralized and best run in a base-load rather than a load-following mode to reduce mechanical stress and increase longevity.
>Nuclear power and coal power are both heat engines turning heat into mechanical energy and subsequently electricity. They merely use a different heat source.
Not completely true. There are some experimental nuclear reactors that convert nuclear energy directly to electricity without the heat cycle, such as [Helion](https://en.wikipedia.org/wiki/Helion_Energy).
When presenting working experimental nuclear reactors Helion is not the company that I would use as my example. They are borderline scammy and given their lack of progress they seem to be stuck in the moving the goalposts phase for a long long time now. I wouldn't bet on them ever completing a working reactor that produces net power.
Maybe, but my point is that it does seem to be possible to generate electricity directly from nuclear reactions, without going through a thermal cycle (making heat, creating steam, using that to turn a turbine). I think there's some other experimental process that promises to do this with fission.
Fusion, specifically, is meant to be the one 'too cheap to meter'. The thought being it'd be ready soon, and grouping both kinds of nuclear together worked as better branding for getting govt. funding.
Well, the other 'biggest issue' is that hydrogen is just a terrible battery.
You need an energy source to make hydrogen (eg out of water, or you make it via fossil fuels etc). When you use up the hydrogen, you get some energy back out. A lot less energy, to be honest.
So it's equivalent to a battery. Not to an energy source.
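Rough round-trip numbers make the point. The efficiencies below are ballpark typical figures I'm assuming for illustration, not measurements from any specific system:

```python
# Ballpark round-trip efficiency of hydrogen as storage (assumed typical
# values: ~70% electrolysis, ~50% fuel cell) versus lithium-ion.
electrolysis_eff = 0.70      # electricity -> hydrogen
fuel_cell_eff = 0.50         # hydrogen -> electricity

h2_round_trip = electrolysis_eff * fuel_cell_eff
li_ion_round_trip = 0.90     # typical lithium-ion round trip

print(h2_round_trip)         # 0.35 -> you get roughly a third of the energy back
print(li_ion_round_trip)     # 0.9
```

Losing roughly two thirds of every stored kWh is what makes hydrogen "a terrible battery" regardless of how cheap the fuel cells get.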
The bikini swimsuit was named after the Bikini Atoll. A bit of a mixup from OP though: it wasn’t named after fusion bombs. The bikini swimsuit was announced a few days after the first public fission bomb test there (Crossroads Able) in 1946. The first fusion bomb test there (Castle Bravo) wasn’t until 1954.
The sun's core is actually very hot; it is the outer layers of the sun that are much cooler. We're trying to do this at roughly ten times the temperature of the sun's core, and I realize the difference is millions of degrees, but on a relative scale that doesn't add much complexity; it would be almost as difficult if the plasma were only half the temperature they are shooting for.
And in a way that higher temp is a result of trying to do this at a smaller scale; as far as I understand it, if you want to be net-positive it gets easier as you get hotter.
So what we are doing is in fact to re-create conditions roughly on par with what is happening in the core of the sun. And it turns out that doing that small, for extended periods, net positive and reliable (without the machine suffering damage from the process) is a very hard problem. Even so I'm very much impressed with these projects, the engineering and the physics are way over my head but I do hope that one day they'll get it working. But I'm not going to hold my breath.
Incidentally, the implications for energy storage if TFA turns out to be on the money are possibly more interesting than fusion in the short term.
> what we are doing is in fact to re-create conditions roughly on par with what is happening in the core of the sun
My understanding is we are not. (Not an expert!) The Sun's core runs around 15 MK [1]. A tokamak, 150 MK [2]. Orders of magnitude rarely come for free in physics.
We need those higher energies because we can't, like the Sun, swaddle with the mass of a hundred thousand worlds a low-temperature, low-frequency weak-force mediated proton-proton reaction [3]. The Sun relies on quantum tunneling to overcome the Coulomb barrier. We humans have to increase the reaction energy so it doesn't all bleed off before anything happens [4], which means using the strong force [5].
If you start to think of the 'temperature' of individual particles as 'the speed with which they move', that is a useful rough approximation for what it means that something has a particular temperature. Containing the plasma is hard not just because of the temperature it is at, but simply because it tends to destroy anything that contains it, and that doesn't really change all that much between 15 million degrees Celsius and 30, 100 or 150. What it does change is that at 150 million degrees Celsius you have some hope of extracting useful work from a very small quantity of plasma. If you don't get it up to those temperatures - again, as far as I understand it - then you will always be putting in more energy than you are gaining, because of some fundamental physics limitations.
So the smaller you make your reactor the hotter you'll have to make it to make it net positive. This leads to the counter intuitive result that making a much larger reactor is actually quite possibly easier than making a really small one. The rate of heat loss is much smaller for a larger reactor and so it becomes easier to sustain the reaction and to extract useful energy from it.
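The scaling intuition above can be sketched with plain geometry (a crude spherical approximation, not a plasma model): heat loss goes with surface area (~r²) while the fusing volume goes with r³, so the loss-to-content ratio falls as 1/r.

```python
import math

def surface_to_volume(r):
    """Surface-to-volume ratio of a sphere of radius r; simplifies to 3/r."""
    area = 4 * math.pi * r ** 2
    volume = (4 / 3) * math.pi * r ** 3
    return area / volume

# Doubling the linear size halves the relative heat loss:
for r in (1.0, 2.0, 4.0, 8.0):
    print(r, surface_to_volume(r))   # ratio ~ 3/r
```

Real reactors aren't spheres and real losses aren't purely surface-driven, but the direction of the argument holds: bigger machines keep their heat more easily.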
It is quite possible that none of the reactors currently on the drawing board or under construction will work well enough to give us a sustained reaction with net yield. But we're getting closer and closer, and there is some (small) chance that I will still see this in my lifetime.
The catch is that as long as you can't get a small reactor to work, getting funding for a much larger one (which you actually may be able to get to work) is going to be extremely difficult. We like to see proof before we scale up. In this case it may well be that such small-scale proof can't be done, or can't be done in a way that will convince backers that a larger-scale device will work.
Yes, true. Apropos antimatter, recently I read here on HN somewhere that lightning generates antimatter as well, and it made me wonder if earthquakes do too but I haven't been able to get a clear answer on that. Fascinating stuff.
Larger reactors may simplify the plasma physics, but it complicates the materials engineering significantly. A huge problem already is creating a structure that can bear the weight of the reaction vessels and the gigantic magnetic fields used for containing the plasma, and also continue to do so for a decent amount of time after being exposed to the constant neutron bombardment of D-T fusion.
I very much doubt currently known materials and structures could be used to construct a reactor 10 times the size of ITER.
This is a high value comment - it's got a hook that I sort of mostly get, but follows that up with jargon laden things I don't understand well, and your references gave me a half hour of rabbit-hole learning. I appreciate the casual knowledge you've passed along. Thanks!
It has 4 fairly common ingredients and only needs a couple of days to create. You can be sure that China will make a spreadsheet with all possible combinations of synthesis and farm them out to their Universities for rapid development... and then they start hunting for the 'ideal' version... flexible, fast to synthesize etc.
The Manhattan project and moon landings happened because the US spent a significant % of its GDP on the project. We might have fusion already if they repeated it...
LK-99 is just chemistry... not nearly as complicated.
This was something that people said would change the world when I was in high school and it really hasn't. (For that matter, the fundamental physics is still not very well understood)
They rapidly got up to liquid nitrogen temperatures so when I was in high school we would go to the welding supply shop, bring back liquid nitrogen to the lab, and do the Meissner effect demo
Liquid nitrogen is very easy to handle (ordinary thermos), liquid helium is much more expensive and harder. When I was in grad school the one required class was the Physics 510 lab, and for that I did an experiment that involved second sound in superfluid helium. That involved cooling stuff down with liquid nitrogen first, then rolling up a huge dewar full of liquid helium, attaching a vacuum pump to get the temperature down to 2K, etc. For all that trouble you get to see
Many years ago, as an undergrad, I was telling a grad student friend how I'd been learning about the selection algorithm - it lets you pick the Kth largest element from an unsorted list in linear time, which is pretty neat.
I said "It's O(n), but the constant is ridiculous in most implementations so it's usually better just to sort and then pick the kth element". The grad student friend said something that stuck with me: "Sure, but the algorithm proves it's possible to find the kth element in linear time. That was never guaranteed. Now we just need to find a better way to do it."
Random conversation that stuck with me, and they probably forgot it a moment later.
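For anyone curious, the practical version of the selection algorithm mentioned above is usually quickselect: expected O(n) with a random pivot (the deterministic median-of-medians variant is worst-case O(n), but with the "ridiculous constant" the comment refers to). A minimal sketch, not from any particular textbook:

```python
import random

def quickselect(xs, k):
    """Return the k-th largest element (k=1 is the maximum) of an unsorted list.

    Expected O(n) with a random pivot; worst case O(n^2), which the
    median-of-medians pivot rule fixes at the cost of a large constant.
    """
    assert 1 <= k <= len(xs)
    pivot = random.choice(xs)
    larger = [x for x in xs if x > pivot]
    equal = [x for x in xs if x == pivot]
    if k <= len(larger):
        return quickselect(larger, k)                    # answer is above the pivot
    if k <= len(larger) + len(equal):
        return pivot                                     # pivot itself is the answer
    smaller = [x for x in xs if x < pivot]
    return quickselect(smaller, k - len(larger) - len(equal))

print(quickselect([3, 1, 4, 1, 5, 9, 2, 6], 2))  # second largest -> 6
```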
Branching off into a philosophical thought here, but I find this to be completely wrong. It was always guaranteed; logic, like physics and chemistry, is not an environment that changes.
We have discovered a functioning technique which might be improved upon. What wasn’t guaranteed was that it would be found.
Biology is the root of most, perhaps all, uncertainty. After all, it is our biology that makes us imperfect observers, thinkers, and makers (but it also enables us to do those things at all!).
I think this is important, because there is a significant difference in mindset between making something and looking for something. Science is looking, technology is making. Things are always "seen" before they are "made".
I remember the first superconductors (long predicted) being announced in the mid '80s. They stayed high on the nerdy headlines for quite a few years. Excitable write ups in New Scientist for us civilians. Nuclear fusion was still 50 years off but room temp superconductors were only a few years off (nope). I went to a posh school in Oxfordshire in the mid to late '80s and my physics class (form) had a field trip to Culham and also a double lesson/lecture done by a handful of Culham physicists back in school. I am very aware of what a privilege that was.
Now I'm 53 and been around the block a bit, I really appreciate how time is required for some things. A lot of time.
Superconductivity was first observed in solid mercury at a temperature of 4.19 Kelvin in 1911, not long after liquid helium was first produced in 1908.
The 1980s discoveries were of the first "high temperature" superconductors (where "high temperature" means "above the boiling point of liquid nitrogen", 77 K).
Liquid nitrogen is much easier to deal with than liquid helium.
Those mid 80s high temperature superconductors are now mass produced for NMRs and fusion startups.
I don't think it's given that all superconductor breakthroughs will require 40 years to get to that point and there's good reason to believe they won't (startup penalty, industry bootstrapping, market finding, etc. have all been completed).
As alluded to those same pop science magazines promised a fusion future too. Here we are, magazines extinct, with fusion startups using LN2 superconductors. Also: no quantum computers, no space colonies, no flying cars (or even supersonic planes), and twitter/reddit/facebook are worse than Usenet.
> As alluded to those same pop science magazines promised a fusion future too.
I get it ...
But how many of them predicted their use in ~36,000 advanced medical imagery devices world-wide?
I'd love fusion power (and flying cars), too, but there's a whole lot of interesting technology between "check out my shiny new super-conductor" and "let's use it to contain plasma that's hotter-than-the-core-of-the-sun-kind-of-hot[0]" that we do benefit from[1], today, to not be too disappointed that we haven't quite reached the greatest potentials.
I don't know enough to speak intelligently on any of this -- who knows -- maybe fusion won't be a possibility until even higher-temperature super-conductors are created ... or maybe there's some other "not possible" in the way (until another discovery is made).
[0] And (if I understand things correctly) it's probably really unfortunate that they traditionally require extreme cooling, likely made more complex given the heat involved and almost certainly requiring far more power than would be required if said super-conductors worked at much higher temperatures.
[1] Myself, personally -- and I have a pretty cool 3D file of my brain backed up to my server as a result.
/// apologies: reading this over it sounded a little hostile; that wasn't intended -- I was merely offering a competing perspective, albeit poorly :)
> I don't think it's given that all superconductor breakthroughs will require 40 years to get to that point
Absolutely right. People generally understand that "collective human knowledge[0]" grows but they think of it as a linear system. The speed at which knowledge grows accelerates -- not at an even pace -- but I'd wager somewhere near exponentially in a lot of places.
And each discovery can change our understanding of other things/accelerate discovery in other areas.
Very true. I remember cheering since the mid 80s every time the record temperature for superconductivity went up, sometimes by 20 K in one go. And then it was quiet for a long long time with a plateau. More recently, two major jumps, the last one of > 50 K (2015, H2S under high pressure), and now this...
I've forgotten how many times one of the most important thresholds in our times has been bumped up a bit. Now we seem to be offered heaven on a plate.
If superconductivity can be reliably demonstrated at RTP (you wear a light cotton shirt, instead of 1 cm thick fancy weaves involving an awful lot of rubber) then we are laughing all the way to ameliorating climate change.
Even if this result is confirmed then I think it will take at least 20 years to dig in to reversing climate change.
No, the argument is that if this is true and it can be commercialized cost effectively, there may be an energy storage solution just over the horizon that would allow for all kinds of things that are currently impossible, such as summer/winter energy storage and other very nice to haves. It is obviously much too early to say anything about this, so think of it as hope rather than anything more solid. But it isn't necessarily about cheaper energy; it's an almost perfect companion to cheap renewables, whose main issue is that it is hard to store their output for the time when you need it most.
It may also make the energy (and economic) costs of fusion make a lot more sense, since superconductors are used to contain the plasma in fusion reactors.
I believe the biggest envisioned possibility is superconducting power lines. Those would allow long-distance power transfer with minimal losses. In the most idealistic scenario, you could imagine a belt of solar power plants around the globe connected by superconducting power lines, providing solar power 24h a day.
Of course, there are huge hurdles to such a project even if we did have the superconducting lines, but there are more realistic similar applications that might actually work.
>In the most idealistic scenario, you could imagine a belt of solar power plants around the globe connected by superconducting power lines
There's no technical impediment to doing this with current tech, and it's not clear the new tech is cheaper.
Power lines are already very efficient, especially the long-distance ones. We would save a bit on converting from HVDC to AC but that's also very efficient.
The numbers I found on a quick search are 3% loss per 1000 km for HVDC lines. That's a pretty huge loss for the 20k+ km lines you'd need to transmit power from daytime to nighttime areas on the globe.
I thought these were mostly fixed conversion losses, but checking this again you're right, it's 3% per 1000km (not including the conversion loss at the stations).
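The compounding matters here: taking the ~3% per 1000 km figure above at face value (a rough number, not from any specific line's datasheet), a quick sketch shows how much survives various distances:

```python
# Delivered fraction of power over an HVDC line, assuming a compounding
# ~3% loss per 1000 km (conversion losses at the stations not included).
def hvdc_delivered_fraction(distance_km, loss_per_1000km=0.03):
    return (1 - loss_per_1000km) ** (distance_km / 1000)

for d in (1000, 5000, 20000):
    print(f"{d:>6} km: {hvdc_delivered_fraction(d):.1%} delivered")
# 20,000 km (daytime to nighttime side of the globe) keeps only ~54%.
```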
There absolutely is. Once you think about this on a global scale HVDC doesn't quite cut it. You can extend the day/night cycle by a few hours each way, which is already very impressive but it isn't enough to cover 24 hours and it definitely isn't going to help you in winter when you want to transport energy from the sunny hemisphere to the darker one.
We can easily generate more than enough renewable energy, just not when and where we need it. Being able to transmit energy over vast distances would greatly improve the economics of our existing renewable energy generation solutions.
I'm not sure what the GP envisions, but one thing you could use superconducting materials for that doesn't require a bunch of extra gear to operate is to store energy in loops of it. This may well allow for a relatively high storage density (probably not nearly as high as chemical ways of storing that energy, but with better efficiency; and for stationary applications density is less of a factor than for anything mobile) with near instant charge/discharge times. Kind of like a solid state version of a flywheel.
That in turn could power an energy revolution which has the potential to reduce carbon based fuel consumption dramatically.
That's SF right now, but there are pathways to very interesting futures unlocked by a material such as the one described in the paper, all of them subject to the usual caveats that it's a 'mere matter of engineering' and that it may prove to be far too costly in practice. And it wouldn't reverse climate change but it could help slow down the acceleration of climate change.
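The loop-storage idea above is basically SMES (superconducting magnetic energy storage), and it's easy to put rough numbers on with E = ½LI². The inductance and current below are illustrative values I picked, not anything from the paper:

```python
# Energy stored in a superconducting coil: E = 0.5 * L * I^2.
# Example values (10 H, 10 kA) are hypothetical, for scale only.
def smes_energy_joules(inductance_h, current_a):
    return 0.5 * inductance_h * current_a ** 2

e = smes_energy_joules(10, 10_000)          # 5e8 J
print(f"{e:.2e} J = {e / 3.6e6:.0f} kWh")   # ~139 kWh, a few household-days
```

Which also illustrates the catch: useful storage needs enormous persistent currents, which is exactly where a low critical current hurts.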
Room temperature super conductors don't quench. Or at least, not until they reach the temperature at which they no longer work as superconductors, which for this one is 127 degrees Celsius.
That does make for an interesting failure mode if anything should ever cause a small spot on a longer conductor to reach that temperature...
No, at the breach it will just burn up until the arc dissipates. But it will be an impressive fireworks display. A superconductor is in the end just another conductor: it has a well defined current carrying capacity while superconducting, and if it stops doing that then that current will suddenly see an increased resistance. How much current is moving through it at the time it fails determines how fast and how violently it burns out when it goes, but it won't be unlike any other transmission line failure. Those are still very impressive:
Fusion power requires powerful magnets (essentially, superconductivity is a requirement). Currently the best contender for commercial fusion is REBCO (see: Commonwealth Fusion Systems).
With cheap fusion power, we can begin pumping that CO2 back into the ground and forget about using fossil fuels for energy entirely.
Reminds me of a story: I was in college physics in fall '89 and our professor was telling us how he and his son spent the summer in Alaska prospecting for whatever material was all the rage in superconductors at the time. He was explaining that once superconductors broke the liquid helium temperature, it was going to be a game changer. He said "If you buy it by the gallon, liquid helium is cheaper than beer."
To which a student replied "You buy beer by the gallon?"
I'd use it for making my own SQUIDs[1], as a start. There are a number of experiments I want to do, and things I want to understand[2][3], and not having to have cryogenics keeping the detectors cold would be helpful.
I'd also like to use this for antennas, transmission lines, and tuned cavities.[4] There are a lot of things you could do at VLF frequencies[5] that require long, long wires... with lots of resistance, unless you have a defense budget, the resistance eats into efficiency. Superconductors could help deal with that.
Just for comparison, a typical MRI magnet is 1.5 Tesla. An NMR spectrometer can go up to 28 Tesla (using new high-temperature superconductors). The LHC magnets are around 8 Tesla.
Those are the kinds of magnetic fields the classic superconductors and the newer high-temperature superconductors can achieve.
> the material as it is might have limited applications
Is LK-99 part of a larger (either known or emerging) class of materials? I'm not understanding what the lead and copper ions are doing to create internal stress, and why that leads to superconductivity.
Pb(2)-phosphate is a crystalline compound. They are creating a 2D film of it using vapor deposition, then doping it with copper ions. This is a standard process in semiconductor manufacturing. There are room temperature, ambient pressure materials doped to create quantum wells in production right now. They are not superconductors, because the quantum wells are merely impurities that reduce the resistance of the material. I believe this paper is claiming a crystal so saturated with quantum wells that it conducts primarily through quantum wells with almost no resistance (and that all of the other physical properties of a superconductor arise from this).
Just like in previous super conductor findings once a material is made and understood that usually paves the way to new discoveries, sometimes those are (big) improvements on the status quo. I'd expect this finding - assuming it is true and verified - to result in massive funding towards the material science labs to try to improve on it. So I'd say this is example '1' of a new class of materials and if it holds up then probably we will find more members of that class once the mechanisms are understood.
For some reason, there’s a contingent of people that think that by poking holes and pooh-poohing things, it gives them clout. It happens far too often in tech and I hate it. Look how often the post has “can’t” or “couldn’t”.
Instead of giving reasons why something sucks, how about being supportive and talking about why it’s awesome and what possibilities this opens up?
I don't think OP is being overly negative in relation to the tone of the rest of the comments here. Nobody else up until this comment had mentioned anything about the actual important performance characteristics that the paper's authors are claiming, and this does put it into perspective with the current state of the art. And OP does even end on an optimistic note anyway. No need to resort to personal attacks.
Edit: I appreciate you toning down the more combative part of your comment.
While I often agree that the tone on these kinds of posts on HN is often annoyingly and unproductively cynical, I think him just pointing out the current limitations of the result is not that much of a problem. It isn't like he's making the overused "perpetually 20 years away" joke about potentially revolutionary technologies.
Superconductivity is not enough for a CPU that doesn't generate heat; you would need a CPU built of reversible logic gates. Thermodynamics requires that when you destroy a bit of information you generate at least 2.9×10^-21 J. This is Landauer's principle.
My understanding is that modern CMOS circuits dissipate around 1 pJ (10^-12 J) per bit, so even if we are limited by Landauer's principle, it would still be a ~9 orders of magnitude improvement. I would surely love a CPU that uses microwatts instead of hundreds of watts.
Yup, the current dominating factor is resistance, not the Landauer limit.
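For concreteness, the Landauer figure above is just E = k_B·T·ln 2, and the headroom claim is easy to check (assuming T = 300 K and the ~1 pJ/bit CMOS figure from the comment above):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # roughly room temperature, K

# Minimum energy to erase one bit (Landauer's principle).
landauer_j = K_B * T * math.log(2)
cmos_j = 1e-12      # ballpark dissipation per bit in modern CMOS

print(f"Landauer limit: {landauer_j:.2e} J/bit")   # ~2.87e-21 J
print(f"CMOS headroom:  {cmos_j / landauer_j:.1e}x")  # ~3e8x, i.e. 8-9 orders
```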
Biological systems supposedly operate at about one order of magnitude above the Landauer limit [0], i.e this is completely practical and has been happening before we even made computers... we probably wouldn't exist without being this efficient, imagine how much energy our cells would need to consume and emit as heat if it were similar to a CPU of today!
I apologize, I couldn't resist the occasion to be technically correct. I should also have stated that a superconductor-based CPU would be incredibly cool, figuratively and physically!
While a room temperature superconductor would potentially have some uses in building a CPU… it's not as relevant as you might think, because the semiconductor is the operating physics at play: it is what makes the transistors perform the electrical switching, and everything stems from that underlying electrical unit.
You could absolutely use superconductors for all the connecting wires, and with a little engineering it might also be good for conducting heat away from the chip die as part of the larger CPU package. It wouldn't be a good interface material (pins for connecting to the motherboard), as you want oxidisation resistance and a level of ductility and malleability to help make a very good electrical connection between the surfaces when they are mechanically pressed together. But overall it will have some uses; it's just not an immediately applicable technology for the silicon chips themselves. There's lots of potential in circuit designs, and I'm sure if someone invented a way to lay this material onto a PCB as easily as we can print copper traces today, that inventor would get pretty rich. It's just not likely to make much of a change to the silicon chip die itself, due to the need for semiconductors to do the transistor switching…
Of course someone might have invented a superconducting transistor that I haven’t heard of and if that’s the case disregard most of what I’ve just written haha.
I thought there was something out there like this, but I wasn't sure if it had actually been built into practical devices yet; cool to see where the state of the art has gone since I was last deeply involved in the ASIC & VLSI world. From what I read there I'd expect significant developments based on a number of the technologies outlined in that article, if we do indeed have a room temperature superconductor we can build miniature circuits with. But one issue I don't think they mentioned is grain size: our transistors are getting very, very small with current lithography technology, and it may be difficult to get equivalently small features (to use the industry term for the wires and bits we build silicon chip transistors out of) using superconducting materials where we have to control grain sizes and multi-element mixtures. There will be lots of work on it, so I'm sure it may change, but there will be a fundamental difference in how small we can make the conductor if it involves multiple elements and crystal grains in specific structural arrangements, unlike a pure metallic wire, which can be as thin as a few atoms (at this size your limits depend entirely on how tolerant you are of electrons accidentally tunnelling to other nearby conductors).
> but there will be a fundamental difference in how small we can make the conductor [...]
Considering how much faster super conductor logic circuits have been demonstrated to be driven [0] it might be worth the trade even with a higher fundamental limit on feature size.
A super conducting chip with far less logic could easily beat CMOS in: (1) Power performance (2) Single threaded performance (where CMOS has stalled) - but interestingly it could still compete in total throughput if the raw frequency is high enough. i.e even though there may be far less logic available compared to the latest and greatest CMOS lithography techniques - if it runs so much faster, less or simpler cores can potentially match or beat the throughput of the more parallel and specialised but slow logic available in CMOS dies.
In short, it could be like taking a step back in time to the simpler, smaller CPU days, but a huge step forward in fundamental frequency. That actually sounds like a nice trade regardless, CPUs are so insanely complex these days.
I work in research, so I understand that it is very important to keep in mind the limits imposed by nature itself. Heck, I had more than one argument with idiot bosses who wanted to break the laws of physics or maths...
Say we use this (or any other) superconductor to build an AND gate. Assume one input is zero, so the information of the other input is lost.
I assume the energy lost as heat occurs due to the "current" going into the AND gate having "nowhere else to go" other than to dissipate as heat?
If that's the case, simply redesigning our logic gates to have as many outputs as they have inputs, with some of these outputs feeding indirectly back to the power source without being read, seems feasible.
It's a thermodynamic argument. Suppose you have a system of n bits each in an arbitrary state, so it has 2^n microstates. You set one bit to 0, regardless of what it was originally. It now has 2^(n-1) microstates. Finding entropy S for a given number of microstates N, S = k_b ln N, and so dS = k_b * (ln 2^(n-1) - ln 2^n) = k_b * ln 2 * (n - 1 - n) = -k_b ln 2, i.e. entropy has decreased by a constant amount. But by the second law of thermodynamics, entropy cannot decrease in a closed system, and the way it's dissipated is as heat. How much heat has been released? dE = T*dS, so dE >= k_b * ln 2 * T. Note the dependence on temperature: you can reduce how much heat is released by having it operate at a lower temperature. Even at room temperature, however, this is a billion times less than existing heat dissipation, so there's lots of room for improvement before we start hitting the Landauer limit.
This can be worked around by introducing ancilla bits to maintain the number of states in the system, but the instant you destroy the ancilla bits (e.g. by feeding them back to the power source), you dissipate energy. The exact mechanics of this would depend on the implementation of the device you're talking about, but you'd inevitably encounter it and be unable to overcome it.
> I assume the energy lost as heat occurs due to the "current" going into the and gate having "nowhere else to go" other than to dissapate as heat?
Not quite, it's a thermodynamic principle that applies to any way you could possibly compute AND. Basically, the laws of physics are reversible, so your computation must be reversible too. There are 4 possible inputs to an AND gate, so to be reversible there must be 4 possible outputs, one for each input. But we only want one output for the rest of our computation, so the extra information has to be dumped into the environment somehow.
I think we're talking about the same thing, but I explained it absolutely terribly.
If we "dump the other output" back into the power source, such as the battery, does that solve the problem of not implicitly dumping it into the environment? Or is it still destroying information?
How will you choose which output is the one with the AND result? You'd need some logic to pick which output was the right one, I assume. Then you've got the same problem again.
I think that you could move those bits to destroy them somewhere else, but what I got from pezezin's post is that there are many orders of magnitude of improvement to achieve before that loss becomes significant enough to warrant the incredible complexity of shipping wasted bits to the heat sink.
This is the first time I'm hearing it phrased this way and I wonder why it hadn't occurred to me before. Thanks so much, in any case, you have just increased my understanding quite a bit. :) *slaps head*
I don't think this actually affects the high level opcodes, i.e it's transparent, so long as the circuit implementing them somehow performs charge recovery, the high level programming can still appear to be irreversible (I don't want to think about the potential side channel attacks that causes when you want to zero some bits!).
Reversible logic is fun, but the memory requirements get intense. You need enough storage to retain every intermediate value used in a computation. If you have a 1GHz 64-bit processor and it does an hour-long computation, you need to store the entire 29TB history of its intermediates... and then spend an hour unwinding it!
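The "29 TB" figure above checks out as a straight back-of-the-envelope, assuming one 64-bit intermediate retained per cycle:

```python
# History a fully reversible machine must retain: one 64-bit intermediate
# per cycle, at 1 GHz, for one hour (then spend another hour unwinding it).
bits = 1e9 * 3600 * 64      # cycles/s * seconds * bits per intermediate
terabytes = bits / 8 / 1e12  # bits -> bytes -> TB
print(f"{terabytes:.1f} TB")  # 28.8 TB, i.e. the ~29 TB quoted above
```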
But currently, the adiabatic chips have a bigger issue with getting to zero: their control circuitry is a bank of AWGs, each burning probably hundreds of watts at room temperature. They ideally don't produce heat in the cold zone, which is great for the cryogenic system, but if we have room temperature superconductors, that's suddenly moot.
How powerful is the magnetic field in typical brushless motor? Even if it can’t be used for an MRI machine, it could do wonders for efficient (and/or compact) robotics and electric vehicles.
I’m also very curious what kind of inductors you could make for switching power supplies using superconductors.
Do you need steel? What about a brushed motor (assume some fancy future micrometer gallium-filled gap that doesn't wear, or something) that just has superconducting coils attracting each other?
You could probably do that but it wouldn't be a whole lot better than the existing traces: it's not usually the resistance that limits the size of circuit traces but the mechanical requirements, such as your ability to connect to them and to space them apart so you don't get crosstalk due to capacitive or inductive coupling.
If you could use it to make circuits, especially at a high level of integration, then it might well be something much more interesting (Josephson tunneling is briefly mentioned in the article). That could theoretically give rise to very efficient switching gear and, if it can be miniaturized enough, to efficient CPUs and memory. This is because the typical transistor uses power mostly during the transition between the 'on' state and the 'off' state, when it is acting as a resistor. If you could get rid of that resistance during the transition then you might be able to reduce the amount of power a given circuit uses, but there are still lower limits on losses that you won't be able to escape, so it will not make your CPU magically use zero energy.
Given the contents of the paper such applications are a very long way off and may in fact never happen. Let's first see (1) if it is true, and (2) if it is true, how well it stacks up against copper wire of the same diameter and commercially available superconductors in terms of cost and practical current carrying capability. If that's all good then this will really be a game changer.
Sometimes the resistance of the trace matters. I’m designing a motor controller that needs to carry 100 amps and the resistance of the copper means that large planes are needed to carry the current without overheating.
Beyond that I agree with everything you’re saying.
That's absolutely true, in power electronics there are applications where the current carrying capability of the traces really matters. Typically you'd either use a very wide trace or some other trick such as via'ing together multiple layers of traces or even to tin-plate the trace. In extreme cases I've seen traces reinforced with solid copper bars.
There is also extra thick copper clad board ('heavy copper PCBs').
To shed a different light on this: think of temperature as walking through mud. Your legs lose energy trying to slowly pull a lot of mud behind you. Now think about skiing. A lot less snow is dragged with you but it flies fast.
Here, what is interesting is if one fusion reaction does happen, then the alpha (helium) particles leave at 2.9 MeV. After two collisions with protons, if the second proton they have hit hits in turn a boron nucleus, then it will have exactly the right energy (612 keV) to have maximum chances at initiating a second fusion reaction.
612 keV is almost 7 billion °C if considered as thermal energy, and no experiment anywhere will get that hot for long. But compared to the energy of the exiting helium nuclei, it's still much lower (0.612 MeV vs 2.9 MeV).
In other words, instead of cascading all the energy down and hoping the sea of particles rises to a few billion degrees so that enough particles fuse to keep the sea of other particles hot, here the energy is preempted by protons after just 2 collisions and used immediately to start a second reaction, which yields more helium nuclei at 2.9 MeV, essentially producing an "avalanche" effect.
Finally, yes, they seem to have devised a way to obtain at least a small part of the energy electrically, without relying on thermal energy, via direct electric field deceleration of very fast charged particles.
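The "almost 7 billion degrees" conversion a couple of paragraphs up is just E = k_B·T (a sketch; using the full thermal relation E = (3/2)k_B·T would lower the figure by a third):

```python
# Temperature equivalent of a particle energy, via E = k_B * T.
EV_TO_J = 1.602176634e-19  # one electron-volt in joules
K_B = 1.380649e-23         # Boltzmann constant, J/K

def ev_as_temperature_k(energy_ev):
    return energy_ev * EV_TO_J / K_B

print(f"{ev_as_temperature_k(612e3):.2e} K")  # ~7.1e9 K for 612 keV
```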
This is like "the ultra rich (very fast particles) manage to create value among themselves without having to cascade their wealth down to the crowd (cold particles), and then upload that value to hyperspace (the electric field from the electrodes), without ever interacting with the mass of the crowd (the mass of the target), until a sufficient amount of fusion reactions have been realized"
And yes, a petawatt (the energy of present day ultra-fast lasers) is a lot of power. It was just chance that there was very little practical use to this kind of power - until now.
That being said, I am not a true expert on this topic myself, so the true barriers lying in front of this concept might be better explained by the other comments here.
> This is like "the ultra rich (very fast particles) manage to create value among themselves without having to cascade their wealth down to the crowd (cold particles), and then upload that value to hyperspace (the electric field from the electrodes), without ever interacting with the mass of the crowd (the mass of the target), until a sufficient amount of fusion reactions have been realized"
This was actually a helpful analogy for me. I'll have to take your word on the accuracy of it, though.
https://www.nature.com/articles/srep01170 which is the first linked paper in that section explains it well. The laser fires through the hole and ablates material from the back disc. The electrons from the created plasma reach the other side of the disc before the ions because they are lighter, causing a buildup of negative charge on the other side. This charge differential drives a current between the two plates, which creates a magnetic field inside the coil.
"the ultra rich (very fast particles) manage to create value among themselves without having to cascade their wealth down to the crowd (cold particles), and then upload that value to hyperspace (the electric field from the electrodes), without ever interacting with the mass of the crowd (the mass of the target)"
Are you saying this kind of fusion is anti social justice? We should ban this immediately!
From what they show, the critical field and critical current seem very low. 2500 Oe is about 0.25 Tesla. Even REBCO at 77 K is >1 T. And the 2500 Oe is not even measured at the critical temperature but much lower. From skimming through the article I couldn't find the sample cross-section for the current measurement, so there's no way to get the critical current density; the bare current (around 300 mA) is meaningless on its own.
This means you can't actually push big current through this thing (yet). You can't make a powerful magnet, and you can't make viable power lines, both applications that were the hallmark of "room temperature superconductor revolution".
Of course, maybe one or a few more tweaks of the material and boom, it will give high J_c and B_c. I really hope it does, it would be super cool!