smart common lisp compilers are orders of magnitude faster than cpython without annotations and can be in the same ballpark as c or fortran with annotations, which means that you can do your numeric work in lisp itself, without calling out to any kind of foreign libraries.
it is not feasible to do numerical work in python itself; in reality python acts as a construction-kit dsl for allocating and manipulating foreign (c and fortran) structures. this is usually sufficient, as one can see from the massive success of libraries like numpy. when you are writing novel numeric code, though, you have to figure out how to match it to numpy's model. this is also not usually a problem, since matching it to numpy's model usually makes it architecturally performant. but one is still wearing a kind of straitjacket at the end of the day.
when i was doing computational chemistry, i wrote a lot of code from scratch in fortran and lisp, and it was entirely feasible to do main loops purely in lisp, implementing algorithms close to their paper versions.
there are all kinds of aspects of lisp that make it pleasant to work with in computational science, but this is already getting long, so i'll mention just one: it's possible to rig your code so that a long-running batch process will not lose its state, without much code overhead. since lisp lets you recover from an error without unwinding the stack, you can trust that after hours or days of computation an error is not going to cost you all your progress. oftentimes you can redefine the offending part of the code and continue processing.
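for illustration, here's a minimal sketch of that workflow using the standard condition system (the function and restart names are invented for the example):

```lisp
(defun process-item (item)
  ;; imagine a bug here: some inputs signal an error
  (if (evenp item)
      (* item item)
      (error "odd item ~a not handled yet" item)))

(defun batch-run (items)
  (mapcar (lambda (item)
            (restart-case (process-item item)
              ;; when an error signals, the stack is still live; from
              ;; the debugger you can redefine PROCESS-ITEM at the repl
              ;; and then pick this restart to re-run just this item
              (retry-item ()
                :report "Retry with the (possibly redefined) process-item."
                (process-item item))
              ;; or supply a replacement result by hand and move on
              (use-value (v)
                :report "Supply a replacement result for this item."
                v)))
          items))
```

nothing before the failing item is recomputed: the error stops the process mid-stack, you patch the function, invoke a restart, and the batch continues from where it was.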
* Julia has multiple dispatch as in CLOS, except generic functions can devirtualize their arguments and be inlined, making the composition of many small function calls significantly faster.
* All functions (except a dozen or so internal builtins) are generic functions which can have methods added to them, and all objects can be dispatched on.
* Objects can be isbits and allocated inline in an array or stack allocated without any pointer indirection.
* Julia's type system is parametric, and types like Array are parameterized on their contents, meaning that you can dispatch on Array{Int} as a distinct type from Array{Quaternion{Float64}}.
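For contrast with that last point, here is a sketch of the closest standard CLOS equivalent (the generic function name is made up): dispatch sees only an argument's class, so an array's element type is invisible to method selection in a way that Array{Int} is not in Julia.

```lisp
(defgeneric magnitude (x)
  (:documentation "Euclidean magnitude of a number or a vector."))

(defmethod magnitude ((x number))
  (abs x))

(defmethod magnitude ((v vector))
  ;; dispatch sees only the class VECTOR here; a (vector double-float)
  ;; and a (vector t) select the same method, so element-type-specific
  ;; behavior needs explicit ETYPECASE checks rather than dispatch
  (sqrt (reduce #'+ v :key (lambda (e) (* e e)))))
```

This is part of why specializing CL numeric code on element types tends to require type declarations and manual case analysis rather than falling out of the dispatch mechanism.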
There are lots of things Common Lisp does really well, but I really do think that in the niche of numerical computing, Julia just blows it out of the water, both for performance and for ecosystem size / vibrancy.
I would say that Julia's great advantage is following in Dylan's footsteps, being a Lisp without parentheses, for those that can't get their heads around them.
From the point of view of code generation, its JIT takes advantage of being built on top of LLVM's optimisations.
http://planet.lisp.org/ is the canonical place to announce lisp-related things. a very small subset of lispers hang out on places like reddit, discord or telegram.
it's fukamachi-ware; the guy has a bunch of his own solutions in the web development space, i suspect driven by his specific needs. they are, fwiw, not as idiosyncratic as some of the artisanal common lisp one-man solutions can be, but they are distinctly and recognizably their own thing.
I use his web stack in some of my projects. His stuff is pretty great, it's just that he basically doesn't write any documentation at all, and the small scraps of documentation he does write assume that you're intimately familiar with the details of his packages. So you need to figure a lot out on your own by digging through source code, which is very irritating when you're trying to be productive.
faré, the guy who briefly took over asdf development before quitting common lisp entirely to do scheme (or whatever he's doing right now), spent a significant amount of time turning asdf from a 10kb file into a 3000kb file in order to solve problems that only he himself cared about, or possibly only google cared about, because faré worked for google at the time (building a portability library that ignored and force-superseded existing solutions, and trying to eliminate corner cases of SAT solving). there was a period where he was harassing implementors (this is a subjective interpretation of the events, but it is my side of them) to incorporate his recent asdf changes, which were breaking left and right, to the point that maintainers of multiple common lisps were considering pulling asdf from their repos entirely. things have stabilized, because faré quit and there's no longer volatility. we're at a point where there's just no consensus about relying on asdf for anything, and while some people go all in on 30% of the available functionality (as opposed to the customary 5%), equally many people, the majority, use it as a glorified (load)(load)(load). this is an unfortunate side effect of activist developers in small communities: they can have an outsized long-term effect on how things are done.
In the Meta-CVS project, I carried a copy of asdf.lisp in which the package was renamed to zxcv, to isolate from the ASDF churn and be able to run my version of ASDF possibly with my fixes even in a Lisp that has ASDF preloaded.
ASDF is the only build system I know which broke on me when I tried to fool it with a link farm (lndir command) into building in a separate directory. It resolved the paths using the truename function which eliminates symbolic links, and then calculated the .fas file names from the canonicalized paths, thereby depositing compiled files into the original directory rather than in the symbolically linked tree.
ASDF also depends on symlinks for external organization. .asd files are often symlinks to the real location, and without these symlinks, things don't work.
i have a patched version of asdf 1.369, which was maintained by gary king, with contributions from luminaries like nikodemus, crhodes, weitz, pvaneynd and kevin rosenberg. i've patched it to support the recent extensions that people seem to like (extra parameters to defsystem, like :author). you can drop it into your implementation and quicklisp and it just works. asdf 2 and asdf 3 have happened since; they solve some kinds of problems, i guess whatever google needed to do with its fleet, or maybe they're "easier to maintain", but they're not the kinds of problems that surface in, like, 60% of quicklisp packages.
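for reference, the kind of defsystem extension meant here is just extra metadata parameters; a minimal example (the system, author, and file names are placeholders):

```lisp
(asdf:defsystem :my-system
  :author "Jane Lisper"
  :license "MIT"
  :version "0.1.0"
  :description "Example system showing metadata parameters."
  :depends-on (:alexandria)
  :components ((:file "package")
               (:file "main" :depends-on ("package"))))
```

older asdf 1.x cores simply rejected or ignored keywords like :author, which is what the patch is about.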
Could you distribute it? I'm not a huge fan of ASDF complexity and have had many headaches trying to make some custom components work. I would welcome any simple alternative, even though I will probably end up writing my own.