Hacker News | rademacher's comments

Moral of the story, the key to happiness is low expectations.


I haven't read this paper yet, so I can't speak to its quality, but it appears to address the same questions as this post. Bengio is a coauthor, so maybe that's a good sign. Here's the abstract:

This paper provides theoretical insights into why and how deep learning can generalize well, despite its large capacity, complexity, possible algorithmic instability, nonrobustness, and sharp minima, responding to an open question in the literature. We also discuss approaches to provide non-vacuous generalization guarantees for deep learning. Based on theoretical observations, we propose new open problems and discuss the limitations of our results.

https://arxiv.org/abs/1710.05468


I suggest this reference [1] for anyone needing to look something up.

[1] https://www.math.uwaterloo.ca/~hwolkowi/matrixcookbook.pdf


The problem is that in high dimensions, knowing the distribution, or even characterizing it fully from data, is incredibly difficult (the curse of dimensionality). I think the real assumption in ML is just that there is some low-dimensional space that characterizes the data well, and that ML algorithms find those directions, ignoring the ones along which the data is essentially constant.
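A minimal numpy sketch of that assumption, on synthetic data of my own choosing (points near a random 3-dimensional subspace of a 100-dimensional space, plus small noise):

```python
import numpy as np

rng = np.random.default_rng(0)
D, d, n = 100, 3, 500  # ambient dimension, intrinsic dimension, samples

# Data lying near a random 3-dimensional subspace of R^100, plus small noise
basis = np.linalg.qr(rng.standard_normal((D, d)))[0]
X = rng.standard_normal((n, d)) @ basis.T + 0.01 * rng.standard_normal((n, D))

# The singular value spectrum reveals the low intrinsic dimension:
# almost all of the variance lives in the first d directions.
s = np.linalg.svd(X - X.mean(axis=0), compute_uv=False)
explained = np.cumsum(s**2) / np.sum(s**2)
print(explained[:4])
```

Even though the ambient dimension is 100, the first three singular directions capture nearly all of the variance; that kind of spectral decay is the usual evidence for low intrinsic dimension.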


Julia only hit v1.0 in 2018. It's doing pretty well for being so young in my opinion.

All the arguments against Julia are basically that Python has a lot of momentum and it takes time and effort to switch to a new language. I think Julia should really seek to displace MATLAB as a near-term goal.


> I think Julia should really seek to displace MATLAB as a near-term goal.

Have you studied MATLAB and its ecosystem? It ranges from real-time control to image recognition to sophisticated engineering-specific toolboxes (RF, 5G, LTE, etc.). They also have proprietary algorithms that do things no Julia package can do. Sure, there are PDE solvers and the ecosystem is growing for Julia, but it is at least an order of magnitude smaller than MATLAB's, if not more. I urge you to explore MATLAB's documentation, APIs, and toolbox details: https://www.mathworks.com/products.html

Take a look at the LIDAR toolbox for example: https://www.mathworks.com/products/lidar.html

or LTE: https://www.mathworks.com/products/lte.html

and I've used DSP toolbox the most: https://www.mathworks.com/products/dsp-system.html

I am not condoning the use of MATLAB, just stating the facts, having used both Julia and MATLAB extensively. I personally like the Julia language, FWIW. At work, we use MATLAB and happily pay for it. Their support is absolutely top-notch, and for us that alone is reason enough to use MATLAB. Julia has support, but nothing close to MATLAB's direct line to compiler engineers (yes, I've had them fix a bug and ship a release in an afternoon).


You're absolutely right. I'll agree with the person you replied to as well though: displacing (at least some portion of) MATLAB use cases really would be a good goal for Julia.

Personally, I managed to switch over 100%, and DifferentialEquations.jl is what made that switch make sense for my work.


Julia has some packages to address this now.

https://github.com/timholy/Revise.jl


SAR has been around since the 1970s and works on the same principles as CT scans (the projection-slice theorem). Think of observing an image by rotating it and projecting it orthogonally to the rotation direction. Take a bunch of these measurements (essentially a Radon transform) and you can then invert the process using the backprojection algorithm.

The resolution of the image is proportional to the bandwidth of the waveform and the distance traversed by the satellite during the collection process.
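To make the CT analogy concrete, here's a rough Python sketch of the textbook Radon transform plus filtered backprojection (a toy phantom and angle set of my choosing, not anything Capella-specific):

```python
import numpy as np
from scipy.ndimage import rotate

def radon(img, thetas):
    # Each projection: rotate the image, then sum along columns
    return np.stack([rotate(img, t, reshape=False, order=1).sum(axis=0)
                     for t in thetas])

def filtered_backprojection(sinogram, thetas, size):
    # Ramp-filter each projection in the frequency domain, then
    # smear ("backproject") it across the image at its angle.
    ramp = np.abs(np.fft.fftfreq(sinogram.shape[1]))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    recon = np.zeros((size, size))
    for t, proj in zip(thetas, filtered):
        recon += rotate(np.tile(proj, (size, 1)), -t, reshape=False, order=1)
    return recon * np.pi / (2 * len(thetas))

# Simple phantom: a bright rectangle on a dark background
phantom = np.zeros((64, 64))
phantom[24:40, 20:44] = 1.0

thetas = np.linspace(0.0, 180.0, 90, endpoint=False)
recon = filtered_backprojection(radon(phantom, thetas), thetas, 64)
```

The reconstruction won't be pixel-perfect with so few angles, but it correlates strongly with the phantom, which is the whole point of the inversion.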

This article is a bit of an exaggeration and Capella is certainly not the first or only SAR service.


Typically, when you downsample, you first lowpass filter and then apply whatever downsampling kernel you want with the correct stride. Since the filter is lowpass, think of it as taking the Fourier transform, keeping an inner, smaller square of the spectrum, and inverting; you can then embed the poison image entirely in that retained frequency band. Now consider the power: if we downsample by a factor of 4, assume the original image retains only a quarter of its power while the poison image loses none. So right off the bat, we are scaling up the poison image's relative power by a factor of the downsampling ratio. For example, the poison image might go from 1/4 of the true image's power to equivalent power. The other aspect is that if the interpolation kernel and strides are known, we can make sure the poison image has large values at exactly those pixels and increase the gain further.
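A small numpy sketch of that spectrum picture, modeling ideal lowpass-plus-decimation as a centered Fourier crop (the sizes, the 4x ratio, and the poison construction are all illustrative assumptions):

```python
import numpy as np

def fourier_downsample(img, m):
    # Ideal lowpass + decimation == keep the centered m x m block of the spectrum
    N = img.shape[0]
    F = np.fft.fftshift(np.fft.fft2(img))
    c = N // 2
    Fc = F[c - m // 2 : c + m // 2, c - m // 2 : c + m // 2]
    return np.real(np.fft.ifft2(np.fft.ifftshift(Fc))) * (m / N) ** 2

rng = np.random.default_rng(0)
N, m = 64, 16  # 4x downsampling in each dimension

# "True" image: broadband noise, with energy spread across all frequencies
true_img = rng.standard_normal((N, N))

# "Poison" image: all energy confined to the retained low-frequency block,
# built by zero-padding the spectrum of a small m x m image
small = rng.standard_normal((m, m))
pad = np.zeros((N, N), dtype=complex)
pad[N//2 - m//2 : N//2 + m//2, N//2 - m//2 : N//2 + m//2] = \
    np.fft.fftshift(np.fft.fft2(small))
poison = np.real(np.fft.ifft2(np.fft.ifftshift(pad))) * (N / m) ** 2

def power(x):
    return np.mean(x ** 2)

true_ratio = power(fourier_downsample(true_img, m)) / power(true_img)
poison_ratio = power(fourier_downsample(poison, m)) / power(poison)
print(f"true image keeps {true_ratio:.3f} of its power, "
      f"poison keeps {poison_ratio:.3f}")
```

The broadband image keeps only a small fraction of its per-pixel power after the crop, while the band-limited poison keeps essentially all of it, which is exactly the relative-power amplification described above.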


I think this suffices as a summary: "The other reality is the frustration and drudgery of operating in a world of corporate politics, bureaucracy, envy and greed— a world so depressing, that many people quit in frustration, never to come back."

Some of us just aren't cut out to work in a big corporate environment. From what I've seen, large technical companies are made up of two sets: the technical set and the manager/business set. Unfortunately, it seems that the manager set wields a disproportionate amount of influence and power, and is therefore "valued" more. I'm sure there are smaller companies that could make the folks leaving stick around the industry. But if they've been successful and are mid-career, they may have priced themselves out of those opportunities.


"Some people aren't cut out for it" explains nothing and is an unfalsifiable statement. It's simply restating what's happening, but in a way that removes empathy.

As someone with 10 years of management experience at big corporations, I encourage you to consider the possibility that managers are not valued more, and that this attitude is itself evidence of the beginnings of corporate burnout.


There is another essay in that series by the same author (linked at the beginning of this one) about exactly this question. It's called "How to Get Promoted" and it covers what you're talking about regarding managerial vs. technical value.


I was expecting this to be just about writing fast matrix multiplies. Surprisingly, it ends up being a lot more than just a triple loop, and there's a reason everyone uses existing libraries like OpenBLAS and MKL [1].

[1] https://www.cs.utexas.edu/users/flame/laff/pfhp/
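As a toy illustration of the gap (sizes and timings are illustrative; the point is just that the textbook triple loop agrees with, and is vastly slower than, a BLAS-backed multiply):

```python
import time
import numpy as np

def naive_matmul(A, B):
    # The textbook triple loop: correct, O(n^3) flops, terrible cache behavior
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m))
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += A[i, p] * B[p, j]
            C[i, j] = s
    return C

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 64))
B = rng.standard_normal((64, 64))

t0 = time.perf_counter()
C_naive = naive_matmul(A, B)
t_naive = time.perf_counter() - t0

t0 = time.perf_counter()
C_blas = A @ B  # dispatches to the installed BLAS
t_blas = time.perf_counter() - t0

print(f"naive: {t_naive:.4f}s, BLAS-backed: {t_blas:.6f}s")
```

The two results match to floating-point precision; the difference is purely in how the flops are scheduled against the memory hierarchy, which is what the linked course is about.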

