Blog Articles 131–135

5 languages for teaching

Rosetta Code is asking for 3-5 languages for teaching orthogonal paradigms. I’ll bite (warning, I’ve spent all of about 15 minutes thinking about this list):

  • Standard ML (or OCaml, if more practicality is desired): functional programming and strong typing, in the simplicity of an H-M type system. Haskell's and Scala's type-system extensions are fun, but pedagogically it seems useful to teach in a simpler environment first.
  • Forth, or perhaps Factor: stack-based programming has similar underpinnings to functional programming while feeling wildly different. And preparing the class would be an excuse to more deeply understand Forth. In the historical spirit of Forth, we’d learn it by building it, so some assembly and machine architecture (likely ARM) would be included as well.
  • Java: a “standard” object-oriented language, industrial-strength programming environment. Imperative programming. Design patterns.
  • JavaScript: dynamic language with prototype-based objects. Of the languages I’ve worked with, JavaScript seems to most deeply embody what it means to be a dynamic language without letting you rewrite the language from the inside. There are limitations, to be sure; you can’t make, e.g., builder DSLs in it. But at its core, it takes well-worn PL concepts (objects, closures, etc.) and makes them thoroughly dynamic.
  • Oz: declarative programming, dataflow concurrency, and the wonders of having unification as a language primitive.

There are a number of languages I wish I could include (Common Lisp, Haskell, and C, to name a few). This list also has a heavily abstract-programming/virtual-machine slant, with the exception of Forth; there isn’t much that runs close to the metal, or even exposes the C/POSIX layer very much. That is, I will admit, a weakness. If I were to add a sixth language, it would probably be C or Perl, to get at procedural programming in a Unix-like environment.

Lessig Blog, v2: Prosecutor as bully

A link to share!

lessig:

But all this shows is that if the government proved its case, some punishment was appropriate. So what was that appropriate punishment? Was Aaron a terrorist? Or a cracker trying to profit from stolen goods? Or was this something completely different?

Early on, and to its great credit, JSTOR figured “appropriate” out: They declined to pursue their own action against Aaron, and they asked the government to drop its. MIT, to its great shame, was not as clear, and so the prosecutor had the excuse he needed to continue his war against the “criminal” who we who loved him knew as Aaron.

Here is where we need a better sense of justice, and shame. For the outrageousness in this story is not just Aaron. It is also the absurdity of the prosecutor’s behavior. From the beginning, the government worked as hard as it could to characterize what Aaron did in the most extreme and absurd way. The “property” Aaron had “stolen,” we were told, was worth “millions of dollars” — with the hint, and then the suggestion, that his aim must have been to profit from his crime. But anyone who says that there is money to be made in a stash of ACADEMIC ARTICLES is either an idiot or a liar. It was clear what this was not, yet our government continued to push as if it had caught the 9/11 terrorists red-handed.

We need to ask these questions of much of our justice system. Disproportionality of justice (or at least of the surrounding situation) seems to be a contributing factor in Swartz’s suicide; how many others are dead, or locked up with their families in tatters, because the U.S. culture of justice (both in official agencies and in society at large) has forsaken balance?

On STL

While in the hospital, in the state of delirium, I suddenly realized that the ability to add numbers in parallel depends on the fact that addition is associative. (So, putting it simply, STL is the result of a bacterial infection.) In other words, I realized that a parallel reduction algorithm is associated with a semigroup structure type. That is the fundamental point: algorithms are defined on algebraic structures. It took me another couple of years to realize that you have to extend the notion of structure by adding complexity requirements to regular axioms. And then it took 15 years to make it work. (I am still not sure that I have been successful in getting the point across to anybody outside the small circle of my friends.) I believe that iterator theories are as central to Computer Science as theories of rings or Banach spaces are central to Mathematics. Every time I would look at an algorithm I would try to find a structure on which it is defined. So what I wanted to do was to describe algorithms generically. That’s what I like to do. I can spend a month working on a well known algorithm trying to find its generic representation. So far, I have been singularly unsuccessful in explaining to people that this is an important activity. But, somehow, the result of the activity - STL - became quite successful.

— Alexander Stepanov, in an interview on the origin and design of the C++ STL. This is a deeply profound way to approach programming and algorithm design, and is also at the heart of what the Haskell community has been doing for some time now. Seriously, if you’ve wondered why the Haskell libraries are riddled with Arrow, Category, Monoid, etc., it is to support exactly this mode of thought. A function declares exactly the type of data it requires, in terms of its necessary operations, and operates on any data matching that requirement. Incidentally, this article, combined with a lab discussion earlier in the week, sparked the revelation that a lot of C++ template design patterns are an impenetrable implementation of type classes.
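
To make the “declares exactly the operations it requires” idea concrete, here is a minimal Haskell sketch of my own (not Stepanov’s code, nor any particular library’s API): a divide-and-conquer reduction whose only demand on its element type is a Semigroup constraint, i.e. an associative <>. That associativity is precisely the property Stepanov credits with making parallel reduction legitimate.

```haskell
import Data.List.NonEmpty (NonEmpty (..), nonEmpty)
import qualified Data.List.NonEmpty as NE
import Data.Semigroup (Min (..), Semigroup (..), Sum (..), sconcat)

-- Divide-and-conquer reduction over any semigroup.  The only thing the
-- algorithm knows about 'a' is that (<>) exists and is associative --
-- and that associativity is exactly what would let the two halves be
-- reduced in parallel without changing the result.
reduce :: Semigroup a => NonEmpty a -> a
reduce (x :| []) = x
reduce xs =
  let half     = NE.length xs `div` 2
      (ls, rs) = NE.splitAt half xs
  in case (nonEmpty ls, nonEmpty rs) of
       (Just l, Just r) -> reduce l <> reduce r
       _                -> sconcat xs  -- unreachable for length >= 2; safe fallback

main :: IO ()
main = do
  -- One algorithm, many structures: sum and minimum are just reductions
  -- over different semigroups.
  print (getSum (reduce (NE.map Sum (1 :| [2 .. 10]))))     -- 55
  print (getMin (reduce (NE.map Min (4 :| [9, 1, 7, 3]))))  -- 1
```

In C++ terms, this is the requirement an STL-style algorithm places on the binary operation it is handed; a type class simply states that requirement in the signature rather than in the documentation.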

Open source is not a set of rules…

Open source is not a set of rules waiting to be gamed by corporate lawyers and lobbyists. It’s the pragmatic embodiment of an ideal called software freedom, based on the understanding that the flexibility to use, study, improve and share software is the essential dynamic of the new meshed society.

— Simon Phipps in an article on the incompatibility of FRAND patent licenses and F/LOSS. Not the core point of the article at all, but I think this bit nicely captures the fundamental synergy and compatibility between free/libre and open source software.