DLL Hell – Same as it Ever Was

Everything old is new again.

How it started (1998):

And all it takes is a single DLL, VBX or OCX to be missing, or present in an older version (or even an incompatible newer version), for an application to fail. A poorly designed installation program, user error, registration error or change in the user’s PATH environment variable are a few of the ways in which this problem can occur.

https://www.desaware.com/tech/dllhell.aspx

.NET – our savior (2001)?

.NET’s versioning features promise to do the most to eliminate the DLL Hell syndrome

https://www.techrepublic.com/article/introducing-the-assembly-a-cure-for-dll-hell/

Yeah no (2021)

DLL Hell is an old term that got a new meaning in managed runtimes like .NET. The original DLL Hell issue was that many applications shared the same DLL file dependency. Then, if one of those applications updated that DLL, it broke the API for the other applications, and caused all sorts of trouble.

In .NET we don’t have that problem. In most cases, applications don’t share DLLs, which prevents issues in the first place. When apps do share libraries, they use the Global Assembly Cache (GAC). This is a place to share libraries on the machine, but it’s only for strong-named libraries. When an application uses a library from the GAC, it requests a specific version, and the strong name guarantees it will get that exact version.

But if you think this architecture solved all our problems, you will be disappointed. We still have problems, just different ones (emphasis mine – RH).

https://michaelscodingspot.com/dotnet-dll-hell/

“We still have problems, just different ones…” sums up a lot of “progress” in software development.
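
For anyone who hasn’t touched the GAC in a while, here is roughly what that version pinning looks like from code. A minimal C# sketch, assuming .NET Framework, and using System.Xml (a strong-named framework assembly that lives in the GAC) as the example identity – nothing here is specific to any one app:

    using System;
    using System.Reflection;

    class GacVersionPinning
    {
        static void Main()
        {
            // A strong name is the full four-part identity:
            // simple name, version, culture, and public key token.
            // (System.Xml is a real .NET Framework GAC assembly;
            // the point generalizes to any strong-named library.)
            const string strongName =
                "System.Xml, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089";

            // Loading by the full display name asks the runtime for that
            // exact identity. Barring an explicit binding redirect, a
            // different version sitting in the GAC won't be silently
            // substituted.
            Assembly asm = Assembly.Load(strongName);

            Console.WriteLine(asm.FullName);   // the resolved identity
            Console.WriteLine(asm.Location);   // typically a path under the GAC
        }
    }

The four-part name is the whole contract. Drop the version or the public key token and you are back to asking the loader to guess, which is more or less where 1998 left off.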

Cloud Skeptic #0

Split your codebase, split your teams, create a lot of opportunities for mediocre coders to grow into mediocre engineering managers, everybody was happy. Including your hardware vendor, because suddenly you needed much more hardware to run the same workloads… The feedback cycle is truly broken – testing a microservice is merely testing a cog in a machine and no guarantee that the cog will fit the machine – but we just throw more bodies at the problem because Gartner tells us this is the future.

Cees de Groot goes Back to the 70s with Serverless.

And the Rule 17 Lifetime Achievement Award goes to…COBOL

The sheer age of those COBOL systems is, oddly, actually something that works in their favor. Because they’re old, they have been relentlessly debugged. When a program is first written, it inevitably has problems…But those COBOL programs that run the world? They’ve had decades for coders and users to uncover all the problems, and to fix them…They’ve been debugged more than just about any code on the planet. This idea — that older code can not only be good, but in crucial ways superior to newer code — is at odds with a lot of Silicon Valley myth making.

Legacy == Proven.

Clive Thompson on the enduring fitness and necessity of COBOL: https://www.wealthsimple.com/en-ca/magazine/cobol-controls-your-money