aleph3: (ants)
This past Saturday I was at BarCamp Madison #3, which was a fine experience. If you want a high-level overview of what a BarCamp is, the Wikipedia entry is a good start; if you want a high-level overview of what one looks like, @raster and Time Lapse Bot cover it in three-odd minutes.

Among the attendees were representatives of several hacker-spaces: Noisebridge in SF, HackPittsburgh, and Sector67, which is just getting off the ground in Madison. ("Hacker" here, just to underline, is in the sense of "tinkerer" and "maker", whether with hardware or software - though there is a bit of a bias towards hardware in the movement, simply, I'd guess, because pooling resources opens up a lot more avenues more quickly to the hardware hacker than to the programmer.) They gave a joint presentation about the movement which was pretty damn awe-inspiring.

In large part just because of the things people have done with this kind of grass-roots, non-profit collaboration and sharing of capital: launching a space balloon; making serious progress towards self-replicating assemblers; and a host of smaller-scale coolness, like cocktail-making robots.

And in equally large part just because of the co-operative nature of the endeavour. Mitch Altman from Noisebridge - who would deserve one of those Benefactor of Humanity statues merely for his creation of a remote which can turn off TVs in public places - laid down his own cheerfully-self-admitted hippie anarchist take on it: "We have no leaders," he said, "and one rule: be excellent to one another. No matter where you are in your life, you can think of a way to make it just a little bit cooler." Wrapping up, he said "If you think you'll ever be in San Francisco, let me know, and I'll cut you a key so you can visit Noisebridge."

That emphasis on the calculus of coolness is a good thing to hold onto, I think, when things get rough. If the first derivative of your life is in the direction of more coolness, in any dimension, that's an achievement.
aleph3: (Default)
A couple of weeks ago now I pitched in to help with the annual inventory at my wife's bookstore. The end product is actually an abstraction of inventory: they don't need titles, or even a count of books, just the total cover price.

This is done by several pairs of people, one person to iterate over the shelves in a section and call down prices, and another to total them on a printing calculator. I naturally had an initial reaction of "can't a general-purpose computer - a laptop with a spreadsheet or a mobile device - do this just as well?", which I soon withdrew, having observed:

- the calculator effectively throttles the flow of information to a manageable pace, by sending a very audible signal that the last number has been entered
- the calculator has a very visible audit trail of all actions, even errors and the recovery therefrom
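For what it's worth, a digital replacement would have to reproduce both of those properties, not just the arithmetic. Here's a minimal sketch of what that could look like; `tally` and its tape format are my own invention, with the terminal bell standing in for the calculator's ka-CHUNK:

```python
def tally(prices):
    """Sum a list of cover prices, printing a calculator-style tape.

    Keeps the two properties noted above: an audible signal after each
    entry (the terminal bell, "\\a") and a visible, append-only audit
    trail of everything entered.
    """
    total = 0.0
    tape = []
    for p in prices:
        total += p
        line = f"{p:8.2f} +"
        tape.append(line)
        print(line + "\a")  # ring the bell: "that entry went in"
    tape.append(f"{total:8.2f} *")  # "*" marks the total, as on a tape
    print(tape[-1])
    return total, tape
```

Even then, a bell from a laptop speaker is a poor substitute for the mechanical thunk of a print head, which is sort of the point of what follows.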

Most shared activities involve a certain amount of coordination, but the manner of that coordination can have different qualities of efficiency and pleasantness for the participants. For instance, in this case, the person entering the numbers could say "Next" if the operation was silent, but over several hours that gets to be unneeded wear-and-tear on the vocal cords.

(One of my favourite examples of "conservation of coordinating information" is paying in cash between Canada and the US. Canadian bills differ sharply in colour and are hard to mistake for one another; with American banknotes, the differences are subtle where they exist at all. So in the US, I've observed, people tend to hand over bills with "There's (denomination)", and the cashier echoes it back. Having picked up this habit, I now find myself doing it in Canada, and of course I sometimes get looked at as though I've just called someone an idiot.)

Some of the other out-of-band benefits of doing things the current way are:
- maintaining a community of people with some solid connection to the business other than being employees or patrons (full disclosure: we got compensated with store credit)
- running several sets of eyes over the whole store, increasing the chance of catching moved or mis-filed books, weird shit placed on shelves (Chick tracts, or the like), and things like that

And you know, apart from a data-entry app which makes a "ka-CHUNK" sound, I can't really think of a very helpful way to replace the printing calculator. Things in a system just tend to have a whole range of undocumented side effects: a recent classic example being that LED traffic lights don't melt snow the way incandescent lights used to, so municipalities now need to dust them off by hand. I'm never gonna say that the lesson is "don't change anything ever", but higher-order effects always need to be considered. Which sounds easy but is hard. At the very least, though, it means that the people who work with a system directly need a lot of input, and not just their managers and the people holding the purse-strings.
aleph3: (Default)
A few days ago I pitched in to help with inventory at the bookstore my wife works at; "inventory" in this sense meaning adding up the list prices of the items in the store. I'm putting together some thoughts on this as an information system. Before embarking on that, though, I want to throw out a pointer and a small example.

One of the more aha! things I've read was the chapter of Peter Checkland and Sue Holwell's Information, Systems, and Information Systems entitled "The Information System That Won The War", about how the RAF organized, distributed, and protected radar data: so that, despite having radar which was not, technically, as good as Nazi Germany's, it made more effective use of it. The core take-away, though, is that the only computers involved were rooms full of people who were very good at sums.

The corollary take-away, and the one which I'm always finding examples of, is this: a lot of information systems are still made of things rather than bits. If you move one to a digital platform, you stand to gain a great deal (scalability, ease of distribution, speed), but a physical system also provides things, seemingly as side effects, that are actually extremely important to how people use it. And those things get lost easily.

As a very simple for-instance: a deck of cards. Both in school, and in job interviews, I've been asked to implement a deck of cards in various programming languages. This isn't a very complicated task, unless you consider one of the main settings in which a deck of cards is actually used: playing a game of chance with money at stake, a setting in which the incentives for providing a deceptive deck of cards are, let's say, considerable. So in card-games played with real decks there are a number of social processes which leverage the physical properties of cards to promote transparency and trust.

For one, there's simply the thickness of the cards, a familiar quantity: a deck deviating from 52 by a noticeable amount will do so visibly.

For another, there's the uniformity of the back face: which increases confidence that someone working with a face-down deck isn't able to discern the identity of the cards. At any rate, actually flipping and sneaking a peek at one is a very visible act which it's hard though probably not impossible to do under the eye of several other players.

Of course, there is card-marking as an exploit, hence shuffling and dealing conventions have a strong tropism towards showing the backs of as many cards as possible, making it harder for a marked card to pass undetected.

And shuffling itself tends to be showy rather than trying for maximal efficiency: the goal being not just to mix the cards effectively but to demonstrate the mixing to observers.

By contrast, you can have a class in your programming language of choice doing everything a deck of cards does in half an hour, but it remains a black box. When you play a card game online, you essentially have to take the probity of Yahoo or MSN, say, as a given: and indeed not just their probity but the skill of their development and QA teams, so that there's no bug which might, if the change to daylight savings time happens mid-game, result in there being two Jacks of Diamonds in the deck.
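For concreteness, here's roughly what that half-hour exercise amounts to in Python (the names and encoding are my own); note that the shuffle, though a textbook-correct Fisher-Yates, is exactly the part no player can observe:

```python
import random

RANKS = "23456789TJQKA"
SUITS = "cdhs"  # clubs, diamonds, hearts, spades

def new_deck():
    """A 52-card deck as rank+suit strings, e.g. 'Jd' for Jack of Diamonds."""
    return [r + s for r in RANKS for s in SUITS]

def shuffle(deck, rng=random.random):
    """Fisher-Yates shuffle, in place: swap each card with a random
    earlier-or-equal position. Provably uniform, and entirely invisible."""
    for i in range(len(deck) - 1, 0, -1):
        j = int(rng() * (i + 1))
        deck[i], deck[j] = deck[j], deck[i]
    return deck
```

Nothing in those dozen lines gives an onlooker the equivalent of watching a riffle shuffle: you just have to trust whoever runs the code, and whatever `rng` they actually passed in.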

All the same issues, of course, but with much, much higher stakes, show up in voting technology. The principle remains the same, though: if someone has skin in the game, they also deserve ways to check that it's on the up-and-up, and not just "trust the experts, peasant".
aleph3: (Default)
Recently I re-read Dune; I noted that my tastes have shifted since I was eight, and my main complaint is now "too many knife-fights, and not enough lectures on ecology". Also, I'm frustrated that the series holds up a long-term project carried out by a culture - the Fremen's multigenerational attempt to change Arrakis from an arid world to a moist Earthlike one - and then completely drops it, for several more books of frustrating Great Man Theory.

Anyway the point is that really long-term projects are intriguing. What kind of software project, I wonder, would take a thousand years? That's about twenty times as long as we've even had software at all!

I came up with one obvious immediate answer: homeostasis. For a piece of software to simply continue executing with the same logic would still require constant work: either creating wrappers so that newer systems can interact with it, or actually porting it to new platforms; a task requiring a formidable array of tests, or, if I can indulge a fond hope, a rigorous enough set of correctness criteria that the port's faithfulness can be checked by a theorem-prover. Honestly, if you really want it to survive for a thousand years and still map the same inputs to the same outputs, while moving it from medium to medium and language to language, I don't see how anything much short of rigorous proof of at least the core components can be avoided. But then, full disclosure: formal methods are my background.
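A low-tech cousin of that theorem-prover check, one we can do today, is differential testing: run the reference implementation and the port over the same inputs and demand identical outputs. A sketch, with both "implementations" invented purely for illustration (a loop standing in for the thousand-year-old original, a closed form standing in for the port):

```python
def reference(n):
    """The original logic, treated as the de facto specification."""
    return sum(i * i for i in range(n))

def ported(n):
    """A reimplementation on a 'new platform': here, the closed form
    for the sum of squares 0 + 1 + 4 + ... + (n-1)^2."""
    return (n - 1) * n * (2 * n - 1) // 6

def check_faithful(inputs):
    """The port is faithful on these inputs iff every output agrees exactly."""
    return all(reference(n) == ported(n) for n in inputs)
```

The weakness, of course, is the "on these inputs" qualifier: a proof covers all inputs, while a test suite covers only the ones someone thought of, which is rather the point of wanting the proof.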

The other answer would be something so enormous that it would actually take that long to finish, and there my imagination fails me. Control system for a Dyson sphere? For a functioning Biosphere-Two-type artificial ecosystem that could sustain a substantial human population? But that's just complexity that comes from being coupled to complex physical systems. What sort of software would just be, in its own right, so complex that it would take a thousand years start to finish?
aleph3: (Default)
Write-up of the Friday 1:20 PM session at DrupalCamp Wisconsin: "Local Economies Mapping and Modeling with Drupal". Sam Rose, the presenter, opened with a quick look at the LansingWiki site, talking about the current state and goals. From there he quickly opened it up to direction from the audience. That led to it being a fairly nuts-and-bolts talk, much more about the mapping than the modeling.

There was talk about module choices: GMap, GMap Views, GMap Taxonomy markers, versus Sam's solution which uses GMap for entering geographical data but displays it using OpenLayers. This sounds like it involves more coding, both in PHP and JavaScript, than using the GMap modules, but the results look good, and apparently grouping things into layers which you can show or hide is much easier. Also, Sam pointed out, pushing local information back into a commons-based resource like OpenStreetMap can help you sleep a little easier than giving it to Google or Microsoft, who reserve the right to get rid of it, or make it private, or monetize it.

Another useful pointer was to MIT's SIMILE project, which is developing a whole set of tools for rendering and exchanging semantic data on the Web, including a JavaScript widget for displaying timelines.

So lots of food for thought! And the stuff about modeling was tantalizing: for instance, they're using the Feeds module to take data from the county about vacant land, and then putting that on a map, so they can track what sort of uses get made of it, or, indeed, make plans to put it to use. Tracking local food production and distribution is a work in progress.

And in general they are working on building a model of the local economy as a network, and then submitting data to a service which will apply economic modeling to look at what-if scenarios. That sounds really groundbreaking, potentially. I wonder if it's actually some sort of network-based economic model, or if they first pour the data into a big vector and do the sort of classic Leontief-style matrix crunching. I'm not very familiar with this field beyond some hand-waving in linear algebra courses; I've made a note to learn a bit more.
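For reference, the Leontief-style crunching I mean goes like this: given a matrix A of inter-industry coefficients (A[i][j] being the units of sector i's output consumed in producing one unit of sector j's), the gross output x needed to satisfy a final-demand vector d solves (I - A)x = d. A toy two-sector version, with all the numbers invented:

```python
def leontief_2x2(A, d):
    """Solve (I - A) x = d for a two-sector economy by direct 2x2 inversion."""
    a, b = 1 - A[0][0], -A[0][1]   # first row of (I - A)
    c, e = -A[1][0], 1 - A[1][1]   # second row of (I - A)
    det = a * e - b * c
    return [(e * d[0] - b * d[1]) / det,
            (a * d[1] - c * d[0]) / det]

# Sector 0 consumes 0.1 of its own output and 0.2 of sector 1's
# per unit produced; sector 1 consumes 0.3 and 0.1 respectively.
A = [[0.1, 0.2],
     [0.3, 0.1]]
demand = [10.0, 20.0]
gross = leontief_2x2(A, demand)  # gross output exceeds final demand
```

A real model would have dozens or hundreds of sectors and use a proper linear solver, but the structure is the same; whether their service does this or something genuinely network-based is the thing I'd like to find out.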

And a final valuable take-away was the emphasis on making a really long-term plan for keeping your data, which constitutes, in a collaborative site, an awful lot of work and expertise. Not just a "drive backup" plan which would make a sysadmin happy: an archival plan which will make an archivist happy.

