The lack of clarity is in keeping with the USB C connector itself, which may supply or accept power at various rates or not at all, may be fast or slow, may provide or accept video or not, and may even provide an interpretation of PCI Express but probably doesn't.
It probably looks the same no matter what, and the cable you pick probably won't be very forthcoming about its capabilities either.
The USB A connector stayed the same between USB 1, 2, and 3. Yet most manufacturers voluntarily distinguished them by giving USB 1 and 1.1 a white insert in plug and port, USB 2 a black insert, and USB 3 a blue one.
This was neither standardized nor enforced, yet it worked remarkably well in the real world.
Then we decided to have no markings at all on USB C cables. On the ports, at least, we occasionally get little Thunderbolt or power symbols.
My audio interface is a Linux computer with FPGAs inside (that actually get field-programmed), with two gigabit Ethernet jacks that each talk to different parts of the machine.
But I don't think anyone here would care about that. It's not such an unusual arrangement. I guess it's kind of impressive to use it on my desk at home, but in pro audio world it's actually kind of mundane.
Maybe I'll write about it more after I get the gumption to gain a root shell on it (or brick it, whichever comes first). I think you guys might find that part more interesting. :)
I’m building an audio device. It runs Linux for the control plane (it’s just a CM4 running Yocto; maybe I’ll leave SSH running on production units, maybe not, haven’t decided yet). No audio passes through the CM4; there’s a dedicated FPGA and MCU for that. It’s been a fun project, my first time doing hardware, so feel free to ask me anything!
This particular box also has RS-232, ssh (with almost zero auth), and telnet as a control plane, by default. Any of that only gets used to tweak/report various things with a rather basic human-readable protocol. (It has built-in functions to make it more secure; I just don't care on my home LAN, or on my pop-up LANs in the field. A sane person with a professional role would have it locked down and on its own VLAN/VPN, but for me and prototyping: Telnet is actually pretty good.)
I designed none of it. I just bought it, and make good use of it. New, it was a mid-4-digit box; used, they're not so bad. (And I use it every day and like it quite a lot, hence the reluctance to go harder on the potential root shell hack.)
My box, as it sits, just does general-purpose GUI-connected DSP stuff with near-realtime tweaking. I'm in the process of getting it to grok OSC, and thus Reaper or whatever, so it has a better control surface for live work.
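Speaking of OSC: the messages themselves are simple enough to assemble by hand, which makes the protocol pleasant to prototype against. A minimal sketch in Python (the `/track/1/volume` address is purely illustrative; the real address map depends on the device and on how Reaper is configured):

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Build a single-float OSC message: address, ",f" type tag, float32 arg."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded out to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# e.g. set a (hypothetical) track fader to 75%:
msg = osc_message("/track/1/volume", 0.75)
```

Fire that at the right UDP port and you've got the beginnings of a control surface.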
It has a USB interface that my Linux box treats as a sound card, which works well. My main reason for wanting to get root is to examine (solve?) its ~5-minute boot times.
5 minutes in a live sound environment is the difference between having a large, active, and involved crowd, and having everyone get bored and find something else to do.
Anyway, the FPGAs here just exist to behave as DSPs and...well, digitally process [audio] signals. It works well; I really just wish it booted faster.
And that may be its downfall. :P
---
But enough about that.
What's your device do? What are your plans and dreams with it? (Do I want one?)
I've built a very small amount of hardware. At least at the level of custom PCBs and some code, it's been richly rewarding even when I screw it up, and it makes me feel like I'm on top of the world when I get it right.
Re: yours, that is a _long_ boot time. Boot time on mine isn't great, but I think I'm just going to have to accept that as an artefact of U-Boot, Linux, and an Ethernet switch chip that takes some time to initialise.
Anyway, re: my widget: it's a personal monitor mixer [1], something one might use in the studio or live, not dissimilar to existing products on the market, except: it supports up to 64 channels of Dante or AVB natively, it has a super nice (HiDPI) UI, and absolutely everything is remote controllable using OCA (AES70) or OSC. I even have an MCP bridge so you can let Claude manage it ;) [2]
The hardware is a custom board that hosts a CM4 SOM (for the control plane and UI), a Brooklyn 3 SOM (Dante), and an XMOS which runs the mixer firmware and AVB stack. There are also some nice AKM DACs, and a Marvell Ethernet switch chip that connects the SOMs and XMOS to two external Ethernet ports.
The CM4 runs Yocto which manages the switch in DSA mode (i.e. hardware offloaded bridge), runs the gPTP and SRP stacks for AVB, the OCA daemon, and the UI (which is just a regular OCA client). SSH is presently enabled but there's not a lot to do once you're in there. Working on secure boot at the moment with U-Boot and dm-verity.
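For anyone unfamiliar with DSA: each front-panel switch port shows up as its own netdev, and the offloaded bridge is set up with the ordinary bridge tooling. Roughly like this (the `lan0`/`lan1` names are hypothetical; actual names come from the device tree):

```shell
# Create a bridge and enslave the two front-panel switch ports.
# With a DSA driver, forwarding between bridged switch ports is
# offloaded to the switch ASIC rather than passing through the CPU.
ip link add name br0 type bridge
ip link set dev lan0 master br0   # front-panel port 1 (hypothetical name)
ip link set dev lan1 master br0   # front-panel port 2 (hypothetical name)
ip link set dev lan0 up
ip link set dev lan1 up
ip link set dev br0 up
```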
Dedicated test gear is a different echelon. We've got some crazy-expensive RF test gear where I work that cost way more than my house. That's an awesome corner of the world, robust-but-fickle at every turn.
Sales volumes are low and development is expensive, so the cost to purchase is also steep. It's an interesting thing to think about, market-wise.
SoundWire. That's an internal[ish], hard-clocked, multipoint, digital audio bus, yeah? I don't know much about it. Looks like it's mostly useful for OE car audio applications?
---
This box I have is just a finished, retail-product, general-purpose pro audio DSP with a good amount of practical analog and digital audio IO. There are many others like it in the marketplace that do very similar things, but this one has a CVE that I want to exploit for my own purposes. :)
---
I really hate being secretive. I strongly prefer to just chat about stuff here, or there, or anywhere.
But even though I'm just some dude in Ohio, my HN comments consistently show up near the top of Google search results when looking at specific topics that I've covered, sometimes just in-passing, so I'm inclined to keep the details to myself for now.
I mean: In the grand scheme of things I haven't even been posting regularly here for very long, but more than once already I've Googled a question and found a link to an answer in my own comment here.
That can be problematic.
This is a great forum for open discussion, and for releasing information, and it is absolutely the wrong forum for secret skunkworks.
If I had a spare box so I could afford to potentially fuck this one up forever, I'd get on with it already. And then, of course, I would publish the results.
I wish I could spill the beans already and maybe get some great help from someone here who does this stuff routinely, but that scares future-me. If the devices can be rooted, then I want them all to be rooted (if useful) -or- better-secured (if not useful).
That sounds fine, except I don't want them to become botnet members, either.
It's a dilemma. There's a lot of this shit out there in the world that doesn't get updated.
We don't place any value on the CE mark in the States.
A lot of consumer electronics need to be FCC compliant, which involves a process of proving that the device doesn't emit too much of the wrong EMI/RFI in the wrong places.
And safety-wise, we tend to use ETL, UL, and CSA for testing. These are third-party Nationally Recognized Testing Laboratories, and their own marks go on the devices they approve. But they're only really concerned with the safety of a product. In very broad strokes: if the device is proven unlikely enough to burn a house down or cause electrical shock to humans, then it gets approved.
CE is a whole different thing. No government body in the USA requires or respects a CE mark on consumer goods; that mark doesn't hold any legal weight here.
Whether good or bad, CE is just not how we roll on this side of the pond.
(Of course, none of that means that laws in the EU don't affect product availability and features here. Globalization be that way sometimes.)
It saves on rewiring stuff. Maybe there's only one person talking today. Maybe they're using PC A, or perhaps they're using PC B instead.
Or maybe there's two people in the room, each on different channels altogether. In this case the other person is just uncorrelated background noise instead of a persistent echo.
Or, in-context: There's two people in the same room, both talking on the same Discord channel.
Anyway, audio routing is useful. Being able to route audio between two different PCs is a pretty neat feature of the Rodecaster.
I'm not sure if it was what OP meant, but it's arguably a good availability technique (as long as you can generate the checksum, that is). Like, if I want to run custom firmware and flash it, having a checksum which verifies that the firmware isn't corrupted may help prevent bricking.
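As a sketch of the idea (the function name and chunk size here are my own; any strong digest would do):

```python
import hashlib

def firmware_matches(path: str, expected_sha256: str) -> bool:
    """Return True if the image at `path` hashes to the published checksum."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large firmware images don't need to fit in RAM.
        for chunk in iter(lambda: f.read(64 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256.lower()
```

Only flash if it returns True, and you've at least ruled out a corrupted download or transfer.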
I wonder if the cutoff date is the result of so many people posting about the date over time and poisoning the data. "Dead cutoff date theory," perhaps.
Whatever it is, the cutoff date reporting discrepancy isn't new. Back when Musk was making headlines about buying/not buying Twitter, I was able to find recent-ish related news that was published well after the bot's stated cutoff date.
ChatGPT was not yet browsing/searching/using the web at that point. That tool didn't come for another year or so.
I mean, my PSP from <20 years ago doesn't support WPA2 or 3 and therefore can't talk to my home wifi unless I make a hole for it.
But as we all know, Italian-made boutique home appliances are different. They have a rich history of having timely manufacturer-supplied technology updates provided as the decades press on.
We know this to be true, just as we know that sarcasm is a myth.
I've been using the same Anker charger brick since 2014. It was $13.99, delivered.
It has two USB A ports. It has always charged everything at a good rate, regardless of brand, model, or age. It's reasonably-compact, the prongs (it's made for US plugs) fold for convenience when traveling, and it is UL listed.
Its present duty includes keeping an iPad running 24x7 and also charging my phone every night. It has charged my phone many thousands of times so far.
I'd update it to something newer, with USB C and USB PD and the bee's knees, but this old Anker thing is exactly the right kind of consistent and boring.
I don't think about it much because it has given me no reason to think about it.
That kind of boring behavior is remarkable, I think. So many other charging bricks I've used were just trash to use (slow or fickle, causing me to waste time with a USB power analyzer before giving up), or they died prematurely.
Same with the powered Anker USB 3.0 hubs on my desk. Those have only seen about 5 years of continuous use but so far they've been resolute in their trouble-free performance.
This stuff seems to be very much buy-once, cry-never.
For instance: Back in the Bad Old Days, charging phones (especially smartphones) wasn't quite as simple as it is today.
The aftermarket cables were shit. Brands came and went overnight (they still do, but they did then too), and even if a person eventually found some cables that worked then it was hard to get more of them later.
The aftermarket charging bricks were shit. I had some that would make capacitive touchscreens go crazy. Some that barely worked. Some that got stinky-hot.
The phone might have a USB port that looked about like all the others, but that didn't mean much: Different phone models had different ways for signalling/confirming/accepting charging capabilities, and they rarely lined up with the method a random charging brick used.
Get the wrong combination on this double-locked mystery box, and it was possible to plug in a phone and have it say it was charging -- even while the reported battery SoC dropped before your eyes.
That was the market. It was fragmented and dysfunctional, and the only sane method to simply charge a phone was to use OE cables with OE power bricks, for real money.
---
Then Anker showed up, kind of out of nowhere. And they were all like "Uh, guys? We sell stuff that actually works."
And they were right. They put together cables that consistently didn't suck (which should not be hard, except...). They started selling charging bricks that worked well with most or all of the phones on the market -- fooling them into thinking they were talking to their OE brick so they'd behave themselves.
It had been a terrible mess. A complete crapshoot.
And then, Anker products just plugged in and worked. They did all the things they said they'd do.
They did it so well that they raised the bar for the entire industry.
And, nowadays, it's not so bad. It's easy-enough to get a reliable cable or a charging brick that isn't a complete turd from a variety of names. That's not a thing that most of us think about much, if at all.
But man, it was fucked up for a long time before Anker stuff became common.
Didn't it come out that their cameras were uploading everything to the cloud even though they swore it didn't? I feel like I remember being very disappointed with Anker for something...
They own Eufy, which sells cameras whose main selling point is “no subscription needed”, but which are very unreliable and full of ads (a fact that isn’t advertised nearly as loudly as the lack of a subscription). They also go big on labelling a lot of simple features as AI, where in reality it’s something as basic as “detect a person in a photo”.
I have Eufy cameras and it’s complete garbage, sadly competition is also mostly garbage.
Bold, unsustainable claims are at the core of their business; it’s not just thumbnails.
I found https://www.youtube.com/watch?v=a_rAXF_btvE to be more balanced than the LTT video, but I think it mostly depends on your expectations of the cameras. The videos themselves are stored locally, not in the cloud. But if you have thumbnails turned on in the notifications, then the thumbnails have to be stored somewhere temporarily (I think this is an Apple/Google requirement), and they're being stored on a cloud server rather than in your home network (which would require opening up a port).
(Be sure to drink your Ovaltine.)