I've been working for a while on a concept I call "mega-libraries."
The idea is that a code library (a string library, for instance) should contain a minimum of, say, 10,000 functions.
What you want to do is apply library science (the geeky librarian kind) to make it reasonable to work with so many functions. Apply everything we know about indexes (I'm not talking about the Google kind), apply everything we know about mapping contexts, apply everything we know about taxonomies, and apply everything we know about tagging and group behaviors, to make these gigantic libraries.
At the end of the day, you should be able to say things like "the function that gives me the extension of this path," and be pretty damn sure that it's going to be in there. (Somewhere.) So much of our programming is given to interpolation. For example, we go, "what's a function that breaks apart the parts of the path?" -- and then, from there, go, "Ok, give me the last part of that." So we're interpolating from what we have, to what we want.
But here's the thing -- "elegance" and "minimalism" in the layout of the API is there mainly because we believe that a human has to read documentation, that a human has to understand everything the API has to support.
But that's not the case. We have the disk space. We should just go, "Give me the function that gives the extension of the filename," -- and have that function.
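To make the interpolation point concrete, here's a Python sketch. The first function is how we interpolate today, going through the stdlib's os.path.splitext and then picking out the piece we want; the second is the kind of direct, ask-for-exactly-what-you-want function a mega-library would just have. The name extension_of is hypothetical.

```python
import os.path

def extension_the_interpolated_way(path):
    # Today: interpolate from what we have (splitext) to what we want.
    root, ext = os.path.splitext(path)
    return ext.lstrip(".")

def extension_of(path):
    # In a mega-library, this exact function would simply exist,
    # findable by asking for it in plain terms. (Hypothetical name.)
    return os.path.splitext(path)[1].lstrip(".")

print(extension_of("/home/user/report.pdf"))  # pdf
```

Same three lines of work either way; the difference is who does the interpolating, you or the library.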
"How do you construct such a mega-library?"
Well, take 10 string libraries, and push all their functions into one gigantic mega-library.
The Microsoft people are famous for string functions like ".Left" and ".Mid" and such. This goes back to the 1980s.
Python has string functions that work on negative indexes and spans from character index to character index.
OK, put those all in one library. You don't have to choose one or the other; just put them all in there.
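Coexistence might look like this sketch, with both dialects side by side; the names are lowercased stand-ins for the BASIC/VB-style Left and Mid, next to a Python-style span:

```python
def left(s, n):
    """BASIC/VB-style Left: the first n characters."""
    return s[:n]

def mid(s, start, length):
    """BASIC/VB-style Mid: 1-based start position, given length."""
    return s[start - 1:start - 1 + length]

def span(s, i, j):
    """Python-style span: character index to character index,
    negative indexes welcome."""
    return s[i:j]

print(left("mega-library", 4))         # mega
print(mid("mega-library", 6, 7))       # library
print(span("mega-library", -7, None))  # library
```

Three ways to say the same thing, all in the library at once. Nobody has to learn the dialect they didn't grow up with.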
Throw in all those functions for the various wiki formats -- the ones that can take wiki-like text and convert it to HTML. Just throw 'em all in there. These converters are, what, 20 KB apiece? If you had 1,000 of them, you've reached, what, 20 MB? Who cares?!?
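A toy sketch of what "all of them, in one place" could look like: a registry of converters keyed by dialect. These two only handle bold text, and the names are invented, but the shape is the point.

```python
import re

def mediawiki_bold_to_html(text):
    # MediaWiki marks bold with triple apostrophes: '''bold'''
    return re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)

def markdown_bold_to_html(text):
    # Markdown marks bold with double asterisks: **bold**
    return re.sub(r"\*\*(.+?)\*\*", r"<b>\1</b>", text)

# One registry, every dialect, no choosing.
CONVERTERS = {
    "mediawiki": mediawiki_bold_to_html,
    "markdown": markdown_bold_to_html,
}

print(CONVERTERS["mediawiki"]("'''bold'''"))  # <b>bold</b>
```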
Throw all the automated tests in the mega-library as well.
Detail code, and arguments, and dependencies. Multiple versions. Network connectivity. Configuration grids. Samples. Whatever. Just throw it all in there.
Just don't make us interpolate. Just let us say, "I want to do X," and have the function be there.
...
There's another thing we can do, too.
Go in the other direction -- instead of Mega-Libraries, make Micro-Frameworks.
These are miniature "frameworks," exemplary code configurations that can be used and reused across systems.
A full function definition, ready to be copied and pasted into your code. Except it's not just copy-and-paste: You can check boxes to get different features of the system to be in use. You can rename the arguments or the types, and the changes to these labelled bindings ripple through the pasted code.
The code will be exemplary. It will "just work," without debugging. It'll be beautiful, and all exceptions will be caught.
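Here's a minimal sketch of the ripple-through renaming, using Python's string.Template as a stand-in for a real micro-framework tool. The snippet and every binding name (func_name, raw_title, and so on) are invented for illustration: rename a labeled binding once, and it changes everywhere in the pasted code.

```python
from string import Template

# A micro-framework: exemplary code with labeled bindings.
# Substituting a binding ripples through the whole snippet.
SNIPPET = Template("""\
def $func_name($arg: $arg_type) -> $ret_type:
    try:
        return $arg.strip().lower()
    except AttributeError:
        return ""
""")

code = SNIPPET.substitute(
    func_name="normalize_title",
    arg="raw_title",
    arg_type="str",
    ret_type="str",
)
print(code)
```

A real tool would also offer the checkboxes -- optional features toggling whole blocks in and out -- but the binding idea is the heart of it.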
...
What binds all these things together is the library science, though.
We need the visual systems of arrangement and contextualization. We need the intelligent tag networks, the taxonomies, the folksonomies, the centers and the ghettos.
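As a toy sketch of what the tag networks buy you: an inverted index from tags to functions, queried by intersection. The function names and tags here are invented; a real index would also know synonyms, folksonomy weightings, and which neighborhoods (centers and ghettos) a function lives in.

```python
from collections import defaultdict

# Hypothetical mega-library functions and their tags.
TAGS = {
    "extension_of": {"path", "filename", "extension", "suffix"},
    "left": {"string", "prefix", "basic", "vb"},
    "mid": {"string", "substring", "basic", "vb"},
}

# Build the inverted index: tag -> set of function names.
INDEX = defaultdict(set)
for func, tags in TAGS.items():
    for tag in tags:
        INDEX[tag].add(func)

def find(*want):
    """Return the functions tagged with every requested term."""
    hits = [INDEX[t] for t in want]
    return set.intersection(*hits) if hits else set()

print(find("path", "extension"))  # {'extension_of'}
```

Say what you want in your own words; the intersection narrows 10,000 functions down to the one you meant.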
I could just as easily have written about how programming is going the way of data navigation and opinionization -- the end result is the same: Complexity, Intelligence, and Ease.
I used to think that, "The more complex the system, the harder it will be to work with." I suppose there's some truth to that, but it's not the overwhelming truth that I thought it was. A lot of times, complexity makes things far easier: I can't remember the last time I consciously dedicated much effort to walking around the house. When things go wrong, complex systems point out to me what's going wrong, so I don't have to spend much time in diagnostics (so to speak). Complexity can be cheaper, freer, and easier than "simple," by which we so often mean, "stupid."
10,000 functions in the string library. Why not?
If you don't have space on your tiny computer, just offload the library to the network, and suck down the parts you need.
Have a complex dependency analysis program, or better yet, some software that can figure out how to compile exactly what you need, with just the parts you need. Why not?
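A sketch of the offload-and-fetch idea, with the network faked by an in-memory dict. fetch_function and REMOTE_LIBRARY are hypothetical; a real version would fetch source over HTTP, verify it, cache it to disk, and pull in each function's dependencies too.

```python
LOCAL_CACHE = {}

# Stands in for the network-hosted mega-library: name -> source code.
REMOTE_LIBRARY = {
    "extension_of": (
        "def extension_of(path):\n"
        "    import os.path\n"
        "    return os.path.splitext(path)[1].lstrip('.')\n"
    ),
}

def fetch_function(name):
    """Suck down just the part of the library you need, once."""
    if name not in LOCAL_CACHE:
        source = REMOTE_LIBRARY[name]  # imagine: an HTTP request
        namespace = {}
        exec(source, namespace)        # compile just this one function
        LOCAL_CACHE[name] = namespace[name]
    return LOCAL_CACHE[name]

print(fetch_function("extension_of")("notes.txt"))  # txt
```

The 10,000 functions live on the network; your tiny computer only ever holds the handful you actually called.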
Data will eat programming.
The complex algorithms will be described, described, and described. The platforms will be described, described, described. The descriptions will be compiled into executables for any platform.
Dictionaries will conquer brittle to-do lists.
I don't want to import modules. I just want to say, "find this functionality in the Internet hive-mind, and adapt this data to work with it."