Is there any reason to use Office nowadays except for being able to open documents sent by institutions where secretaries still use Word/Excel/PPT? (universities, etc.)
Excel is the best spreadsheet software in my experience when you have to move beyond the basics. I’ve even tried hard to use the open/Libre alternatives.
Hacker News is a different world than the target customer base for these products. If your use case for spreadsheet software is putting things into tables with some formatting and some light formulas then all of the products will do the same job.
For professionals who use these tools, suggesting they use LibreOffice or something is the equivalent of someone coming to you and suggesting you give up your customized Emacs or Visual Studio Code setup in favor of Notepad++ because they both edit text and highlight code.
> Excel is the best spreadsheet software in my experience when you have to move beyond the basics. I’ve even tried hard to use the open/Libre alternatives.
I strongly agree, but even for the basics! I use LibreOffice for personal use and put up with it only because it’s not Microsoft. It’s laggy, copy paste sometimes doesn’t work, the user interface is quite dated, the fonts are ugly…the list goes on. I donate to Document Foundation so that it can get better, but it moves very slowly.
> Excel is the best spreadsheet software in my experience when you have to move beyond the basics. I’ve even tried hard to use the open/Libre alternatives.
I agree 100% with this, since I've been trying the same. Although I do think some power-users take it way too far and should be using more robust data analysis tools (Python, DBs) instead of having these monstrous Excel spreadsheets with millions of rows and columns.
I tried showing a finance guy a Python version of a levelized cost of electricity spreadsheet he made. He laughed in my face and continued using Excel to drive executive decisions.
> Excel is the best spreadsheet software in my experience when you have to move beyond the basics.
This might be true. But most Excel users just use the basics and would be well served by switching to a Free Software alternative such as LibreOffice Calc, which is also capable of being used in advanced contexts; although in those cases it is, admittedly, different from Excel.
I think most of the excuses for why people don't switch to Excel alternatives are simply cover-ups for inertia. I understand that; getting out of the comfort zone is difficult. But it's not impossible.
I strongly disagree. If you double-click a CSV, Excel usually opens it in your local code page instead of UTF-8, but they got rid of (or hid very well) the old text import function, so now importing a CSV fires up Power Query instead. Power Query is OK, but it doesn't like irregular data, and it saves the query connection automatically. If you massage the data in PQ before you import, it's unlikely that someone who comes after you will know what to do with the query you made. They don't make it easy to save the query for reuse with similar files; actually, they make it pretty difficult.
LibreOffice Calc just gives you an import window with some pretty good defaults, like UTF-8. It could be better, but at least it is not worse.
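For anyone scripting around the same trap, the code-page-vs-UTF-8 failure mode is easy to demonstrate in a few lines of Python; the cp1252 fallback below is my stand-in assumption for "your local code page":

```python
import csv
import io

def read_csv_rows(raw: bytes):
    """Try UTF-8 first (tolerating a BOM), then fall back to a legacy
    code page. cp1252 is an assumption standing in for "your local
    code page", which is what a bare double-click in Excel tends to use."""
    for encoding in ("utf-8-sig", "cp1252"):
        try:
            text = raw.decode(encoding)
        except UnicodeDecodeError:
            continue
        return list(csv.reader(io.StringIO(text))), encoding
    raise ValueError("undecodable CSV payload")

raw = "name,city\nJosé,Köln\n".encode("utf-8")
rows, enc = read_csv_rows(raw)

# Decoding the same UTF-8 bytes under the legacy code page instead
# produces the classic mojibake ("José" turns into "JosÃ©").
mangled = raw.decode("cp1252")
```

The mojibake in the last line is the telltale sign you have hit this: UTF-8 bytes silently decoded under a one-byte code page.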
Excel added useful array functions. Good luck finding anyone who can handle that.
Tables in Excel are not really first class citizens. They move differently than everything around them but they don't have an obvious interface for working with them from other parts of the spreadsheet. Within a table you can refer to rows by name, but not outside, really. If you click on a pivot table for a reference, it gives you a GETPIVOTDATA function, when you might have actually wanted E3 or whatever.
And don't get me started on "dates", "numbers", "text", etc.: Excel's weakly strict datatypes.
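That weak typing is infamous enough that gene symbols like MARCH1 and SEPT2 were officially renamed to stop spreadsheets from eating them. A rough Python heuristic (my own approximation, not Excel's actual parsing rules) for spotting text that a naive "General"-format import might silently turn into dates:

```python
import re

# Month prefixes whose combination with trailing digits the "General"
# cell format is notorious for coercing into dates.
MONTHS = ("JAN", "FEB", "MAR", "APR", "MAY", "JUN",
          "JUL", "AUG", "SEP", "OCT", "NOV", "DEC")

def excel_would_mangle(value: str) -> bool:
    """Rough heuristic (an approximation, not Excel's real behavior):
    flag tokens like MARCH1, SEPT2, OCT4 that a naive import may
    silently convert to dates. Expect some false positives."""
    v = value.strip().upper()
    return any(re.fullmatch(m + r"(CH|T)?\d{1,2}", v) for m in MONTHS)
```

A pre-import pass with a check like this is one way to find cells that need to be forced to a text type before the spreadsheet ever sees them.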
Calc has many features focused on correctness and reliability. Excel is a joke on both of those counts.
Turns out close to 100% of the spreadsheet users out there don't care about that. It's unnerving and absurd, and IMO, what is even the point of all the effort of entering your data and working it if you don't care about the result being correct? But that's how the world is.
For an individual, probably not. I've been an OpenOffice and LibreOffice user for my personal use and contracting business since 2004. I've had no need for "real" Microsoft Office in that time. I also don't deal in macro-encrusted documents or with the more esoteric features.
For an org where individual users aren't technical, I'd never try to get by w/o Microsoft Office. The assumption by all large orgs that you're going to use Microsoft Office is pervasive. Even if the Free Office suites work fine, tech support is always going to be mired down in compatibility issues, both real and perceived.
>When Visicalc was released, Perez became convinced that it was the ideal user interface for his visionary product: the Functional Database. With his friend Jose Sinai, he formed the Sinper Corporation in early 1983 and released his initial product, TM/1 (the "TM" in TM1 stands for "Table Manager"). Sinper was purchased by Applix in 1996, which was purchased by Cognos in late 2007, which was in itself acquired mere months later by IBM.[3][2]
TM1 is widely used as a way to interface with official ledgers.
Yup, I have. And had to deal with converting "this awesome tool that does X, Y and Z" to an actual multi-user app because it was just so great. You end up discovering that there are tons of miscalculations in these formulas that only surface when you start writing tests for them, and that a lot of the business decisions based on these tools had flawed assumptions.
Having said that, I love that Excel has democratized app-building and made it easier for people to solve their own problems. In terms of alternatives, I think it's more about the UI and mental model that people have when using Excel, not necessarily the functionality. There may be 1-to-1 competitors in terms of functionality, but in terms of UX, Excel is sort of king.
My first job out of uni was developing a devops pipeline for Excel spreadsheets after one went rogue and cost the broker trader I was hired by $10m in one fun afternoon.
An application I consulted on was a web interface that made heavy use of the Excel portions of Microsoft Graph, so that the finance team could continue to send clients spreadsheets that clients could adjust, without also sending them the formulas to "steal" (and take other parts of their business elsewhere, to the tune of noticeable millions of dollars in project-spending shifts). The finance team wasn't going to stop using Excel ("how dare you suggest it"), so it was a wildly custom solution: figure out where formulas existed in any of the spreadsheets finance felt like giving to the app, build a custom UI for entering the inputs to those formulas, run those formulas (mostly with Microsoft Graph cloud magic, some with other web libraries), and return the results.
If it had been just about any other group than that company's "finance department" that so deeply wanted the "just tightly wrap Excel in a web UI and leave the key computations as Excel formulas we can continue to edit in Excel, because all we want to understand is Excel" project, it would probably have been rightfully laughed out of the room. Finance holds the keys to a lot of companies and likes keeping those keys, for comfort, in Excel.
I would like to see the finance team that codes all their own C code and is adamant it needs to be in Emacs, especially because if they were that deep into Emacs, I'd be wondering why they insist on C rather than Emacs Lisp or something even more esoteric like GNU Guile or someone's custom Forth-to-Fortran compiler…
But to answer the question, that is where I finished. We weren't "okay with it" that the finance team insisted on a C# to Excel files in SharePoint/OneDrive via Microsoft Graph turducken. We lived with it because the finance team had enough of the metaphorical keys to the car to be deeply in the driver's seat of that project. Sometimes you just have to grit your teeth and deliver what the customer wants.
I know a few people who use Quantrix for financial modelling. It is an exceptionally nice piece of software, basically the successor of Lotus Improv, with Improv's more robust and auditable separation of data and formulas.
I use Apple Numbers for all my spreadsheeting, so it depends what you mean by "serious financial work". The vast majority of folk could probably get by without using Excel, I am guessing.
Power Query + Power Pivot + M. I don't use formulas in cells. The sheets are just a canvas for Pivot Tables, final tables, and charts connected to the data from Power Query and Pivot.
I deal with hundreds of API integrations involving various JSON, CSV, TSV, and XML files with mixed validity. My workflow: Notepad++ for a visual check -> Prototype everything in Excel. I give users a "visual", connect it to real data, and only then migrate the final logic to BI dashboards or databases.
Nothing else delivers results this fast. SQL, BI tools, and Python are too slow because they generally need "clean" data. Cleaning and validation take too much time there. In Excel, it's mostly just a few clicks.
PS: I spent 2 years (2022-2023) using LibreOffice Calc. I didn't touch Excel once, thinking I needed to break the habit. In the end, I did break the habit, but it was replaced by a pile of scripts and utilities because Calc couldn't do what I needed (or do it fast enough). The experience reminds me of testing Krita for 2 years (2018-2020) — I eventually returned to Adobe Photoshop (but that's another story).
PS2: About (Query + Pivot + BI). This allows you to process millions of rows (bypassing grid limitations). It also allows you to compress everything into an OLAP cube, taking up little space and working quickly with data.
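For comparison with the click-driven workflow above, here is roughly what the structural-triage step on mixed-validity feeds looks like as a Python sketch; the column names and sample data are hypothetical:

```python
import csv
import io
import json

def triage(csv_text: str, json_column: str) -> dict:
    """Quick structural triage of a CSV of mixed validity: count rows
    whose column count disagrees with the header, and rows whose JSON
    payload column fails to parse, before any real ingestion."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    idx = header.index(json_column)
    ok, bad_shape, bad_json = 0, 0, 0
    for row in reader:
        if len(row) != len(header):
            bad_shape += 1
            continue
        try:
            json.loads(row[idx])
        except json.JSONDecodeError:
            bad_json += 1
            continue
        ok += 1
    return {"ok": ok, "bad_shape": bad_shape, "bad_json": bad_json}

# Hypothetical feed: one clean row, one with a broken JSON payload,
# one truncated row.
sample = 'id,payload\n1,"{""a"": 1}"\n2,not-json\n3\n'
report = triage(sample, "payload")
```

Whether this counts as faster or slower than a few clicks in Power Query is exactly the tradeoff being argued here.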
Interesting. I'm not experienced in data cleaning.
About Python vs Excel:
Isn't manual cleaning of data in Excel prone to persistent errors? Because:
- it's hard to version control/diff
- it's done by a human fat fingering spreadsheet cells
- it's not reproducible. Like if you need to redo the cleaning of all the dates, in a Python script you could just fix the data parsing part and rerun the script to parse source again. And you can easily control changes with git
In practice I think the speed tradeoff could be worth the occasional mistake. But it would depend on the field, I guess.
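The reproducibility point can be made concrete with a small sketch; the date formats below are hypothetical, but the idea is that fixing the FORMATS tuple and rerunning re-cleans every row identically, which cell-by-cell hand edits can't guarantee:

```python
from datetime import datetime

# Hypothetical mixed date formats, as they often arrive from exports.
# To change the cleaning rules, edit this tuple and rerun: every row
# is re-processed the same way, and the change is one diffable line.
FORMATS = ("%Y-%m-%d", "%d/%m/%Y", "%b %d, %Y")

def normalize_date(value: str) -> str:
    """Parse a date in any known format and emit ISO 8601,
    failing loudly on anything unrecognized instead of silently
    leaving a mis-typed cell behind."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(value.strip(), fmt).date().isoformat()
        except ValueError:
            pass
    raise ValueError(f"unrecognized date: {value!r}")

rows = ["2024-01-05", "05/01/2024", "Jan 5, 2024"]
cleaned = [normalize_date(r) for r in rows]
```

Because the whole cleaning step is a script, it also diffs cleanly under git, which is the version-control point above.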
> - it's hard to version control/diff
As I mentioned, this is only prototyping.
After that, we move on to implementation in code, knowing what we want to see in the end and understanding the nuances of the data itself.
> - it's done by a human fat fingering spreadsheet cells
No one is entering anything into the cells, please reread the message.
> - it's not reproducible. Like if you need to redo the cleaning of all the dates, in a Python script you could just fix the data parsing part and rerun the script to parse source again. And you can easily control changes with git
And that's what I said above. That it takes longer. Why use git/python when I can do it in a few clicks and quickly see a visual representation at every step?
> In practice I think the speed tradeoff could be worth the ocasional mistake. But it would depend on the field I guess.
Another sentence that shows once again that you haven't read what was written.
For enterprises it almost always comes down to: does it reduce risk, is it easy to manage, does it have the authentication & authorization features we need, is it good enough, and is it compatible with our current stuff.
While I agree, there is no reason NOT to use a perpetual license (e.g. for Excel 2016), unless you really, really need the subscription-based version.
You may notice that the last editions of software that had perpetual licenses but moved on to a subscription model tend to be very expensive today, as they are no longer sold and people know how to count. So let's use the opportunity while it lasts, as I don't believe the end of perpetual licensing for Office (or Windows, for that matter) is far away.
SharePoint and Office are the modern version of cancer. Nobody wants to manage on-prem AD and mapped drives because cLoUd is the solution. It doesn't help that Microsoft stopped caring about on-prem.
Excel has no competition whatsoever in the local software space. Google Sheets is somewhat useful for 80% of users but for people who must be on-prem/local it’s Excel or nothing.
Someone really should Pixelmator Excel. That’s a viable startup, I think, though I have no idea what the GTM looks like. Some killer feature/perf that makes people install it alongside?
I just open them in Google Docs/Slides and then export later to the original format after edits. I’m sure it’s not feature complete but it’s good enough!
In practice, it's like how having Adobe Reader used to be. You mostly don't need it, but occasionally you need it for interoperability with other people, such as lawyers.
Otherwise, I keep it around for the desktop Excel app. Still my preferred spreadsheet app, even though Google Sheets does pretty much all of what I need.
ChatGPT is amazing for interpreting test results. Of course you should back it up with a doctor.
Back when 3.5 came out, I gave it some information about a condition I had as a teenager that (multiple) doctors totally misdiagnosed. It immediately told me three tests I should have had done, two of which would have diagnosed it right away. Instead, I had to deal with extreme fatigue for over a decade until I finally did the research on my own and had those same tests done.
As far as test results go, right now we're dealing with our dog having increased thirst. She's been on prednisone for a year, and that's not an uncommon side effect. We brought her in to the vet, and they tested her and diagnosed it as stage one kidney disease, with no mention of the prednisone. I put those results and her details into ChatGPT, and it told us it could absolutely be the prednisone, and that we could use an inhaler for what we were using the prednisone for: chronic bronchitis. Our vet never offered that option. We'll find out in a few months whether she actually has kidney disease or not, but chances are it was just the prednisone.
As a bonus, the vet before this one diagnosed her bronchitis as heart failure. They didn’t run any tests, scans, etc. Just “sorry, your dog is going to die soon.” What a fun week that was.
ChatGPT is an amazing second opinion tool. Obviously you need to ask it neutral, well formed questions.
Yeah, it's a self-made help article; if you don't know any better, this is what you do. That doesn't make it the best choice overall, though.
It feels like the guy had a... mediocre GP, got scared by a skin cancer diagnosis, and over-corrected to the most expensive path possible; and since stuff was indeed found, we have this article, roughly correct but written in a sensationalist (or freaked-out) style. Some claims are outright false (like GPs not knowing heart disease is the biggest killer... really).
My wife is a doctor who works across both public and private healthcare, and those private services have their own motivations, which often aren't the straightforward help-as-much-as-possible but rather milk-as-much-as-possible: tests, scans, long-term treatments and so on. CT scans especially pour a non-trivial amount of radiation into the body, which can itself cause cancer down the line.
With public healthcare you at least know the primary motivation isn't cash flow but helping patients; the issue is rather overwhelmed resources with limited time per patient. It always depends on the individual: as with engineering, there are better and worse practitioners, yet we all somehow expect every single doctor to be a 100% stellar, infallible expert with 150 years of experience across all branches of medicine (absolutely impossible for any human being). Look around at your own workplace, if you are an engineer, and notice the spread of quality/seniority among your colleagues. The same happens in medicine, only the stakes are (much) higher.
I think it was mostly the eager evaluation that made it possible to debug every step in the network forward/backward passes.
Tensorflow didn't have that at the time which made debugging practically impossible.
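For what it's worth, the debugging benefit of eager (define-by-run) execution can be illustrated without PyTorch itself; the toy scalar autograd below (my own sketch, not PyTorch's API) makes every intermediate a concrete value you can print or breakpoint on the moment it is computed:

```python
class Scalar:
    """Toy define-by-run autograd node. The forward pass runs eagerly,
    so each intermediate .value exists as a plain Python float you can
    inspect mid-computation, unlike a deferred static graph.
    (Toy simplification: assumes an expression tree, not a shared DAG.)"""
    def __init__(self, value, parents=(), local_grads=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents
        self._local_grads = local_grads

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Scalar(self.value * other.value,
                      (self, other), (other.value, self.value))

    def __add__(self, other):
        return Scalar(self.value + other.value, (self, other), (1.0, 1.0))

    def backward(self, seed=1.0):
        # Chain rule, applied recursively; also inspectable step by step.
        self.grad += seed
        for parent, local in zip(self._parents, self._local_grads):
            parent.backward(seed * local)

x = Scalar(3.0)
w = Scalar(2.0)
b = Scalar(1.0)
h = x * w     # a concrete intermediate right now: print(h.value) works here
y = h + b
y.backward()  # gradients land on x.grad and w.grad, inspectable too
```

A static-graph framework of that era would instead build symbolic nodes here, with the actual numbers only materializing inside a later session run, which is why stepping through a forward/backward pass was so painful.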
- It is extremely slow and resource intensive. Opening any link/page takes at least 3 seconds on my fastest computer, even though the content is mostly text with images.
- It cannot be used without JS (it used to be at least readable; now only the issue description loads). I want the bug tracker to be readable from Dillo itself. There are several CLI options, but those are no better than just storing the issues as files and using my editor.
- It forces users to create an account to interact and it doesn't interoperate with other forges. It is a walled garden owned by a for-profit corporation.
- You need an Internet connection to use it, and a good one. Loading the main page of the dillo repo requires 3 MiB of traffic (compressed). This is more than twice the size of a release of Dillo (we use a floppy disk as the limit). Loading our index of all open issues downloads 7.6 KiB (compressed).
- Replying by email mangles the content (there is no Markdown?).
- I cannot add (built-in) dependencies across issues.
I'll probably write some post with more details when we finally consider the migration complete.
It's an excellent choice. Though Microsoft alone should be a sufficient answer. Many people will never interact with github projects because it requires an account with the most unethical company that ever existed.
There are companies out there whose main source of business is wetwork, as in drone strikes and so on. With Microsoft going for the most-unethical title, I don't think it's even in the ranking.
I do agree that not using it is better morally. However, given the limitations of git vs Fossil, which carries the issues and wiki inside the repo itself, it's not a good idea to switch to another service without guarantees that its host will be forever standing. GitHub won't die in the next decade, but the alternatives mentioned might; even Google (Code) got out of the source-hosting business.
But your confidence in GitHub's continued existence comes from its network effects, no? And competing services can only gain such network effects if more people use them. So to me this feels like a defeatist argument.
This is not a magical achievement: GitHub is solvent and its business model is solid; give the same number of users to any other service and it might collapse. Any service has to scale, and the more users it has, the more costs it incurs. Nonprofits are a risk: the moment they run out of money, the service collapses.
If it's not a magical achievement, then surely competitors could replicate it too.
Of course you can't put a million users today in a service used by a thousand yesterday, but I don't buy the "non-profits don't scale" argument. If that were true, we wouldn't have Wikipedia either.
Replication is not enough. Competition only wins if it offers lower cost or better service (or a tolerable service if free). While yes, the userbase is essential, you're still ignoring why the userbase is there in the first place: hosting services existed before GitHub, and GitHub is the one that ended up winning. Competition cannot win just by offering a better ethical stance, and it's not even that, since GitHub itself is not doing anything criminal; it's simply aligned with Microsoft. So the ethical stance amounts to "I don't like AI" and "I don't like Microsoft", and that is simply not enough of an offer to make the entire userbase switch. The only way it could happen is if GitHub decided to throw away its whole userbase, like Bitbucket did, and given that its name is GitHub, I doubt they'll ever do that.
To clarify, I think it's fair to say "I use GitHub because I don't think MS is that bad" (I disagree, but it's at least a consistent view.)
I only take an issue with "I think MS is morally reprehensible but everybody uses it so I'll keep using it too" because it's a self-fulfilling prophecy. Most people looking for code hosting will use whatever they first run into, so when you choose a host for your project you are also directly channeling users towards said host. It's your responsibility to pick a host for your projects that isn't evil by your standards, whatever those may be.
Not even in the tech world. Microsoft did more than its fair share of cutthroat business practices, but there are tech companies out there that are quite literally thriving on worker exploitation.
Belgian Congo rubber companies under King Leopold, Atlantic slave-trade companies, Basil Zaharoff and Vickers, who sold machine guns to both sides while agitating for WW1, IG Farben, who supplied the Nazi gas chambers, Shell, which since the 1970s has spent billions funding climate-change disinformation and billions preparing for it at the same time... It's a long list.
Take Hyundai, a brand I drive: child-labor factories, in the 2020s... You can't even brush your teeth without the ghosts of slavery.
Reasons 1, 2, and 4 convince me the most. It's insane how slow and cumbersome GitHub's code review page has been ever since they moved from Rails to React.
It's an informative post, but I really dislike the language and style that are becoming common in this kind of post, e.g.:
> Look, first of all, you’re as unique as the other 1000 peanut gallery enjoyers that have made the same astute observation before you. Congratulations. But you’re absolutely missing the point.
Why do we hold calculators to such high bars? Humans make calculation mistakes all the time.
Why do we hold banking software to such high bars? People forget where they put their change all the time. Etc etc.