Hacker News

Please elaborate?


Current electron-based computers use transistors tens of nanometers across. The optical equivalent of a transistor cannot be smaller than about 1 µm. An optical CPU equivalent to your smartphone's would be the size of several football fields.


Asking from a position of total ignorance. The energy savings mean you can increase clock speeds, right? Assuming a big enough jump, won't that relieve a CPU of the need for most specialised instruction sets, and potentially also for that many cores? In that case, wouldn't it be acceptable for transistors to grow (back) in size?


Energy savings on what basis?

If your gate gets 50x50x50 times bigger, you need some pretty extreme savings per unit area/volume of circuit if you want to reduce per-gate usage. Can they save that much?
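The 50x figure above follows from the sizes quoted earlier in the thread (tens of nm electronic vs ~1 µm optical). A minimal back-of-envelope sketch, assuming a 20 nm electronic feature size (my assumption; the thread only says "10s of nanometers"):

```python
# Back-of-envelope gate scaling. The 20 nm electronic feature size is
# an assumed value; the ~1 um optical gate size is from the thread.
electronic_nm = 20     # assumed electronic transistor feature size
optical_nm = 1000      # ~1 um minimum optical gate size

linear_ratio = optical_nm / electronic_nm   # factor per side
area_ratio = linear_ratio ** 2              # footprint factor
volume_ratio = linear_ratio ** 3            # volume factor

print(linear_ratio)   # 50.0
print(area_ratio)     # 2500.0
print(volume_ratio)   # 125000.0
```

So at equal total power, each optical gate would need to be roughly 2,500x more efficient per unit of chip area just to break even.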


I would think yes. Which would be something. Huge rooms of enormous optical computers running lightning fast on low power would have a kind of retro-future feel.

Light would reduce the time cost of distance and increase the density of connections (optical signals can pass through each other), so this could actually work.


> Optical equivalent of transistor cannot be smaller than 1um

For classical optics. Superlens optics exist, which use metamaterials and a monochromatic light source and can "see" features much smaller than the wavelength.


While this is true, doesn't it ignore the difference in clock-rate capacity? What if the photonic CPU can run at 10,000x the clock rate without the extreme heat build-up that would melt a smartphone?


Isn't part of the point that you don't need as many transistor equivalents because you can run them thousands of times faster?



