jjk7's comments | Hacker News

They'll resent you insofar as it was confrontational vs. collaborative. If you can incept your conclusion into others, they will not resent you. It's the whole raison d'être of the Socratic method.

I had someone tell me, earnestly, that they hated me because it turned out that I was right. Not in the stubborn sense either.


>Reporting Mechanism: In countries with Intergovernmental Agreements (IGAs), such as Canada, financial institutions report to local tax authorities, which then share the information with the IRS.

Probably accessibility APIs

Which specific ones, though, allow you to send input to a window without raising it? People have been trying to do "focus follows mouse [without auto raise]" for a long time on the Mac, and the synthetic-event equivalent of command+click is the only discovered method I'm aware of, e.g. as used in https://github.com/sbmpost/AutoRaise

There is also an old blog post by Yegge [1] which mentions `AXUIElementPostKeyboardEvent`, but there were plenty of bugs with that, and I haven't seen anyone else build on it. The modern equivalent would presumably be `CGEventPostToPSN`/`CGEventPostToPid`. It's a good candidate, though; perhaps the Sky team they acquired knows the right private APIs to use to get this working.

Edit: The thread at [2] also has some interesting tidbits, such as Automator.app's "Watch Me Do" feature, which can also do this, and a CLI tool that claims to use the CGEventPostToPid API [3]. Maybe there are more ways to do it than I realized.

[1] https://steve-yegge.blogspot.com/2008/04/settling-osx-focus-... [2] https://www.macscripter.net/t/keystroke-to-background-app-as... [3] https://github.com/socsieng/sendkeys
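For concreteness, here is a minimal Swift sketch of what the pid-targeted approach could look like, assuming `CGEvent.postToPid(_:)` (the Swift binding of `CGEventPostToPid`) actually delivers to the target as hoped. The key code and target pid are placeholders, and the calling process needs the Accessibility permission; as noted elsewhere in the thread, this API is reported to be unreliable on its own.

```swift
import CoreGraphics

// Sketch: post a synthetic Cmd+<key> keydown/keyup pair directly to one
// process by pid, without raising or focusing its windows.
// NOTE: keyCode/targetPid are placeholders; requires Accessibility trust.
func sendCmdKey(_ keyCode: CGKeyCode, to targetPid: pid_t) {
    let src = CGEventSource(stateID: .hidSystemState)
    guard
        let down = CGEvent(keyboardEventSource: src, virtualKey: keyCode, keyDown: true),
        let up = CGEvent(keyboardEventSource: src, virtualKey: keyCode, keyDown: false)
    else { return }
    down.flags = .maskCommand   // hold Command for both halves of the press
    up.flags = .maskCommand
    down.postToPid(targetPid)   // delivered to that pid only, not the
    up.postToPid(targetPid)     // system-wide HID event stream
}
```

Contrast with `CGEvent.post(tap: .cghidEventTap)`, which injects into the global event stream and therefore goes to whichever app is frontmost.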


You don't actually need to send CGEvents to UI elements to make them do things ;)

Could you elaborate on what you mean? My understanding of the Cocoa event loop was that ultimately everything is received as an NSEvent at the application layer (maybe that's wrong though).

Do you mean that you can just AXUIElementPerformAction once you have a reference to it and the OS will internally synthesize the right type of event, even if it's not in the foreground?


Yes, you can do a lot of background UI interaction using the AX APIs. Displaying a second cursor is also simple: just a borderless, transparent window that moves around.
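A sketch of that second-cursor trick, under the stated assumption that it really is just a floating window (the class name, window level, and cursor image are all illustrative choices, not anything from BetterTouchTool):

```swift
import AppKit

// Sketch: a borderless, transparent, click-through window that draws a
// cursor image and can be repositioned programmatically.
final class FakeCursor {
    private let window: NSWindow

    init(image: NSImage) {
        window = NSWindow(
            contentRect: NSRect(origin: .zero, size: image.size),
            styleMask: .borderless,   // no title bar or chrome
            backing: .buffered,
            defer: false)
        window.isOpaque = false
        window.backgroundColor = .clear
        window.ignoresMouseEvents = true   // real clicks pass through it
        window.level = .screenSaver        // float above normal windows
        window.contentView = NSImageView(image: image)
        window.orderFrontRegardless()
    }

    func move(to point: NSPoint) {
        window.setFrameOrigin(point)
    }
}
```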

For the few things you cannot achieve with the Accessibility APIs, there are ways to post events directly to an app, even though CGEventPostToPid is mostly broken when used on its own; it requires a combination of CGEventPostToPid and CGEventTapCreateForPid. (I have done a lot of this stuff in my BetterTouchTool app.)
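For the AX route, a minimal sketch of driving a control in a background app, assuming the usual element-tree walk (the traversal here is deliberately naive, and `pid` is a placeholder; the caller must pass `AXIsProcessTrusted()`):

```swift
import ApplicationServices

// Sketch: press the first button found in an app's main window via the
// AX APIs, without raising or focusing the app.
func pressFirstButton(ofAppWith pid: pid_t) {
    let app = AXUIElementCreateApplication(pid)

    var windowRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(app, kAXMainWindowAttribute as CFString, &windowRef) == .success
    else { return }
    let window = windowRef as! AXUIElement

    var childrenRef: CFTypeRef?
    guard AXUIElementCopyAttributeValue(window, kAXChildrenAttribute as CFString, &childrenRef) == .success,
          let children = childrenRef as? [AXUIElement]
    else { return }

    for child in children {
        var roleRef: CFTypeRef?
        AXUIElementCopyAttributeValue(child, kAXRoleAttribute as CFString, &roleRef)
        if (roleRef as? String) == kAXButtonRole {
            // The target app performs the action inside its own run loop;
            // no synthetic mouse or keyboard event is involved.
            AXUIElementPerformAction(child, kAXPressAction as CFString)
            break
        }
    }
}
```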


Neat, good to know! And it does seem my mental model of the event loop was wrong: accessibility-related interactions don't have any corresponding NSEvent.

They are handled as part of the "conceptual" run loop, but they seem to be dispatched internally by the AXRuntime library from a callback off some mach port. Because of this, the call to nextEventMatchingEventMask in the main -[NSApplication run] loop never even sees such an NSEvent:

    -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:]  (in AppKit)
        _DPSNextEvent  (in AppKit)
          _BlockUntilNextEventMatchingListInModeWithFilter  (in HIToolbox)
            ReceiveNextEventCommon  (in HIToolbox)
              RunCurrentEventLoopInMode  (in HIToolbox)
                CFRunLoopRunSpecific  (in CoreFoundation)
                  __CFRunLoopRun  (in CoreFoundation)
                    __CFRunLoopDoSource1  (in CoreFoundation)
                      __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE1_PERFORM_FUNCTION__  (in CoreFoundation)
                        mshMIGPerform  (in HIServices)
                          _XPerformAction  (in HIServices)
                            _AXXMIGPerformAction  (in HIServices)

In some sense this is similar to Apple events, which are also "hidden" from the caller of nextEventMatchingEventMask. From what I can see, those are handled by _DPSNextEvent, which sorts based on the raw Carbon EventRef: aevt types have `AEProcessAppleEvent` called on them and are then consumed silently, while others get converted to a CGEvent and returned to the caller to handle [1]. But of course accessibility events didn't exist in Classic Mac OS, so they couldn't be handled at this layer and were pushed further down. You can almost see the historical legacy here..

[1] https://www.cocoawithlove.com/2009/01/demystifying-nsapplica...


Maybe they used Claude to come up with a good method to do this. /s

But I was also wondering how this even works. The AI agent can have its own cursor, and none of its actions interrupt my own workflow at all? Maybe I need to try this.

Also, this sounds like it would be very expensive, since from my understanding each app frame needs to be analyzed as an image first, which is pretty token-intensive.


There are better ways to analyze on-screen content than images.

Allbirds ran on Shopify... was the CTO a shoe engineer?

At least tokens are a reasonable proxy for measuring 'thinking'... I wouldn't mind if it burned 100k tokens to output a one-line change that fixes a bug.

The problem is maximizing code generated per token spent. This model of "efficiency" is fundamentally broken.


Why worry?

"I used to be with it, but then they changed what it was. Now what I'm with isn't it, and what's it seems weird and scary to me, and it'll happen to you, too." - Abe Simpson


Use the atmosphere itself as propellant gas.


Tiled at different zoom levels


Worked at a place that did a kind of arbitrage between ad clicks and traditional print. A large percentage of traffic, especially on mobile, was obviously either toddlers or bad bots; yet we were billing our customers for the 'engagement'.

