MarkusWandel's comments

Shannon's information theory assumes optimal coding, but doesn't say what the optimal coding is.

Anyway: a single observer who lies 20% of the time gives you 4 out of 5 bits correct, but you don't know which ones...

And N such observers, where N > 2, give you a very good way of extracting more information (best-of-3 voting etc.), approaching, in the limit of infinite observers, a perfect channel...

so, interpolating for N = 2, there is more information there than for N = 1. It just needs more advanced coding to exploit.


I don't have the math for this, but a colleague does, and after spending a few minutes in Matlab he came up with about 0.278 bits/flip of channel capacity (Shannon) for the single observer, and, I think, around 0.451 bits/flip for the dual observers. That's the theoretical capacity with optimal coding. Whatever coding schemes need to be employed to get there, i.e. what redundancy to add to the bit stream... that's the hard part.
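
Out of curiosity I redid the single-observer arithmetic myself; below is a minimal sketch in Python (my own, not the colleague's Matlab). The two-observer number depends on how you model the second observer, so under these assumptions it comes out closer to 0.46 rather than 0.451, but the qualitative point is the same: two unreliable observers carry more than one, though less than twice as much.

    # Capacity of a binary symmetric channel with crossover probability 0.2
    # (an observer who lies 20% of the time), plus the mutual information for
    # two such independent observers reporting on the same uniform input bit.
    from math import log2

    def H(*ps):
        """Entropy in bits of a probability distribution."""
        return -sum(p * log2(p) for p in ps if p > 0)

    p = 0.2  # probability that an observer lies

    # One observer: BSC capacity = 1 - H(p)
    c1 = 1 - H(p, 1 - p)

    # Two independent observers: I(X; Y1,Y2) = H(Y1,Y2) - H(Y1,Y2 | X)
    p_agree = 0.5 * ((1 - p) ** 2 + p ** 2)   # P(Y1 = Y2 = 0), same for (1,1)
    p_split = 0.5 * 2 * p * (1 - p)           # P(Y1 = 0, Y2 = 1), same for (1,0)
    c2 = H(p_agree, p_split, p_split, p_agree) - 2 * H(p, 1 - p)

    print(f"one observer : {c1:.3f} bits/flip")   # ~0.278
    print(f"two observers: {c2:.3f} bits/flip")   # ~0.461 under these assumptions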

"get not you might it but"

The 1541 is a computer, as defined by "can load and run a program". Enough protocol exists on the stock drive/IEC bus/software to do this. Fast loaders used this, and I'm sure some copy-protection schemes did too.

But it's a computer in the same way that a bare-bones microcontroller with an ARM core is, say, the one in your car key fob. Sure, the CPU is capable, but it's paired with just enough ROM and RAM to do the job it needs to do. And in the 1541's case that was only 2KB of RAM.


Unix oldtimer here (first exposure: 1987). A lot of copy/pasting is at the shell prompt. Aside from being super lightweight - just select something in previous output, e.g. a file path, middle click, and done - what about the key bindings? All the world uses ^C for copy, but that already does something conflicting at the Unix shell prompt.

I have to admit that I do feel like an oldtimer though. What I do at the shell prompt, others do in VS Code, and probably 10x faster once they're good at the GUI. So maybe super-lightweight copy/paste at the shell prompt just doesn't matter that much any more.


That is also the one good thing about the Windows command line: you use right click there to copy and paste, which is nice. The rest sucks.

I cannot stand the Windows user experience in their command line. The Linux method actually has two separate clipboard buffers that allow different content to be copied and pasted.

Say I used CTRL+C to copy something but need something else copied too: I just highlight the second item and paste it with the middle mouse button, while the first can still be pasted with CTRL+V.

On Windows you would have to destroy the contents of the CTRL+C clipboard and replace them with what the middle mouse button would have held, then go back to the first source to copy and paste it again.
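
For anyone who wants to poke at the two-buffer behaviour directly, here is a minimal sketch (assuming an X11 session with the xclip utility installed) that puts different content into the highlight/middle-click buffer (PRIMARY) and the Ctrl+C/Ctrl+V buffer (CLIPBOARD), then reads both back:

    # Demonstrates that X11's PRIMARY selection (highlight + middle click) and
    # CLIPBOARD selection (Ctrl+C / Ctrl+V) are independent buffers.
    import subprocess

    def set_selection(name, text):
        # xclip takes the text on stdin and holds the named selection
        subprocess.run(["xclip", "-selection", name], input=text.encode(), check=True)

    def get_selection(name):
        out = subprocess.run(["xclip", "-selection", name, "-o"],
                             capture_output=True, check=True)
        return out.stdout.decode()

    set_selection("primary", "pasted with the middle mouse button")
    set_selection("clipboard", "pasted with Ctrl+V")

    print("PRIMARY  :", get_selection("primary"))
    print("CLIPBOARD:", get_selection("clipboard"))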


You want a clipboard manager/history. You are using middle-button paste as a workaround for how hard it is to find a good clipboard manager (I'm not sure one exists...).

I have and use all three on Linux. I only use Windows at work, where the IT is strict.

Tangential - what do people do faster in VS Code than on the terminal?

The whole "integrated development" experience. Take it or leave it, but old farts like me go all the way back to poring over code on printouts since your only window into it was one file at a time in an 80x25 terminal - not terminal window, actual terminal or, by then, terminal emulator.

That does affect later habits like, for example, hating the information overload from syntax highlighting. And don't even get me started on auto-indent.

Whereas younger colleagues, whose habits were formed in the era of much more sophisticated tools, have dozens of files open in tabs and can follow, say, a "where is this defined" or "where is this driven" (this is RTL code, not normal software) in an instant. Keep in mind some oldtimers had really fancy emacs setups that could do that, and vi users had things like ctags.


They imagine that they're being more efficient.

I argue the opposite! Phone cameras, while hardly perfect, are easily on par with the sort of cameras people used to do street photography with, and improving constantly.

I too remember the "no photos" rules - in the pre-smartphone era. Technically you weren't even supposed to bring a camera in to the workplace (though this was mostly unenforced).

Now you can take pictures and videos of everything, willy-nilly, and nobody bats an eyelash. With a camera that you always have with you, whether you anticipated taking photos that day or not.

And yeah, you can't play shallow-focus games (notwithstanding that the phone will fake shallow focus with an algorithm). And you don't get real zoom (pinch zoom doesn't count).

Oh, on the "real camera" front. Show up with a Canon SX30 ("big" camera, lots of glass in front) and people might notice. But show up with an SX210 (these are cameras I happen to have) and you can get great stealth shots with its 14x zoom but no one the wiser. It's just a small point and shoot, harmless, right? This thing is leaps and bounds more capable than a camera that size back in the pre-digital days.

I'll bet a GoPro will get a pass too.


This used to actually work, at least on some sites. The text would load first, then the page would reformat as the fonts and CSS assets were loaded. Ugly and frustrating, which is probably why you now don't get content until the eye candy is all ready.

But the progressive, text-first loading was readable from the get-go, even if further downloads stalled.


But profits!

What I don't get about these laser defense systems: Doesn't the attacker just have to attack on a foggy day?

If Hamas could only fire rockets on foggy days, Israel would have many more rocket-free days.

It's absolutely an issue, although this is outside the visible spectrum and degradation may be a bit less severe compared to visible light.

Resonance! Very minor earthquakes can knock pictures off the walls, items off the shelves, etc., if they just happen to hit the right resonant frequency. So if you flood the area with 8 Hz-ish acoustic energy, some stuff will start to shake.
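
To make the resonance point concrete, here is a rough sketch (illustrative numbers only, assuming a hypothetical picture frame that happens to resonate around 8 Hz) of the steady-state response of a driven, damped oscillator; the amplitude peaks sharply when the drive frequency matches the natural frequency:

    # Steady-state amplitude of x'' + 2*zeta*w0*x' + w0^2*x = (F/m)*cos(w*t),
    # showing the sharp peak when the drive frequency hits the natural frequency.
    from math import pi, sqrt

    def amplitude(f_drive, f_natural, zeta=0.05, force_per_mass=1.0):
        w, w0 = 2 * pi * f_drive, 2 * pi * f_natural
        return force_per_mass / sqrt((w0**2 - w**2)**2 + (2 * zeta * w0 * w)**2)

    f0 = 8.0                  # hypothetical natural frequency of the object on the wall
    dc = amplitude(0.01, f0)  # near-static response, as a baseline
    for f in (2, 4, 6, 8, 10, 12):
        print(f"drive {f:>2} Hz -> {amplitude(f, f0) / dc:5.1f}x the static response")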


My main computer monitor, ancient now (a Dell U2711), was a calibrated sRGB display when new and still gives very good colour rendition.

Are movies produced in this colour space? No idea. But they all look great in sRGB.

A work colleague got himself a 40" HD TV as a big computer monitor. This was a few years ago. I was shocked at the overamped colour and contrast. Went through all the settings and, with everything turned to minimum - every colour saturation slider, everything that could be found - it was almost realistic but still garish compared to sRGB.

But that's what people want, right? Overamped everything is how those demo loops at Costco are set up, that's what sells, that's what people want in their living rooms, right?


> But that's what people want, right? Overamped everything is how those demo loops at Costco are set up, that's what sells, that's what people want in their living rooms, right?

I just want colors accurate to the artist's intent, and a brightness knob. No other image “enhancement” features.

