The Technical-Social Contract
We think we understand the rules of commerce. Manufacturers and sellers advertise; we buy or not, as we choose. We have an intuitive understanding of how advertising works, up to and including a rather vague notion that advertisers try to target "suitable" customers. Similarly, manufacturers and sellers have an understanding of how people buy and use their products. However, technology has been changing what’s possible for all parties, frequently faster than people’s assumptions can adjust. This mismatch between what we expect and what is actually happening is at the root of a lot of high-tech conflict, ranging from peer-to-peer file-sharing to Apple’s iPhone.
Let’s look at the iPhone first. People who have purchased one feel, not unreasonably, that it’s theirs and they can do whatever they want with it, from making phone calls to tossing it into a blender. However, Apple has a different view. They think they’re selling a device and a service, and that the requirements of the service preclude user control of the device. It’s not that Apple is necessarily wrong in their approach (though I think so); rather, it’s the mismatch between consumers’ expectations and Apple’s plans that has caused trouble. The result? Dueling news stories about iBricks versus newer, better "hacks".
The converse happened with peer-to-peer file-sharing. Consumers were the ones to adopt new technology — digital distribution of songs — while the music industry was holding back. This isn’t a new reaction by the music industry to changing technology. Twenty years ago, the RIAA tried to persuade Congress to mandate Copycode, but tests by the National Bureau of Standards (now NIST) showed that the scheme didn’t work. And in an episode now largely forgotten, the American Federation of Musicians went on strike in 1942 for royalty payments from the record industry, in what is generally known as the Recording Ban of 1942.
Fundamentally, these incidents are all the same: people had a mental (and sometimes legal) model of what was "normal" and possible; technology changed, and one party’s behavior changed with it, to the shock of the other. The reaction was the same, too: effectively, the creation of a new crime or form of misbehavior I call "felony interference with a business model".
The same dynamic has hit the Internet advertising market. Ad blocking on the web is catching on — after many years of languishing — so we see accusations that using it is tantamount to theft. But note the explanation given here:
No flashing whack-a-mole banners. No highly targeted Google ads based on the search terms you’ve entered. To say nothing of pop-ups and pop-unders.
What happened? Both parties found their expectations weren’t met. Advertisers felt that ads modeled on newspaper ads weren’t effective, so they used technology to enhance ad visibility. Consumers resented the distraction, annoyance, and bandwidth consumption; some have resorted to ad blockers.
Turning again to the iPhone, we see that one party — Apple — is trying to use technology to tilt the balance in its favor. This is resented even by people who have no desire to switch phone carriers or to install non-Apple applications, because they correctly perceive a shift in their model of the world.
In one sense, there’s no need to panic: there are always shifts. People and institutions are remarkably flexible over time. Consider "traditional" newspaper ads — which used to appear on the front page. The danger comes with enforcement of the existing model, whether by law or by technology. That sort of thing freezes innovation, by blocking technologies that threaten today’s behavioral model.
From my perspective, Apple’s attempts to lock down the iPhone via repeated updates won’t work; people will break through each new locking technology. Strange excuses for their behavior will be seen as just that: strange excuses. If nothing else, the market will have the final say; if people really want a freer device, someone will build one that’s even better than the iPhone, though perhaps not cooler.
The danger will come if Apple succeeds on the legal front, via the DMCA or the like. Technologies change, which upsets people; eventually, people adapt, and a new norm is reached. We have to avoid artificial interference with that dynamic.
Screendump: #1 in a Random Series of Messages You Shouldn't See
While trying to visit a baseball web site, I saw this:
Now, I don’t blame the server for being unhappy, since I was trying to look at the score of a Mets game, but let’s look at it more closely.
Why does a config file have a login name and password? It turns out that that’s a documented feature — or rather, misfeature — of .NET. Some part of the server needs to invoke another subsystem with different privileges; storing the credentials in the configuration file is a documented and often-recommended way of doing it. It’s dubious, from a security perspective, but in many cases it’s necessary. More precisely, it’s often necessary to store credentials in some file — but why in a configuration file? Why not let the configuration file — a file about which you may want to display diagnostic messages — simply point to another file that contains nothing but the password? (I should add that ’sportsrus’ is a very bad password…)
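A minimal sketch of the alternative I’m suggesting, in Python for concreteness; the file names, section names, and paths are all hypothetical:

```python
# Sketch: app.conf holds only a *pointer* to the credentials, never the
# credentials themselves.  Hypothetical config contents:
#
#   [service]
#   credentials_file = /etc/app/service.secret
import configparser

config = configparser.ConfigParser()
config.read("app.conf")

cred_path = config["service"]["credentials_file"]
with open(cred_path) as f:      # mode 0600, readable only by the service
    password = f.read().strip()

# Diagnostic code can now dump app.conf freely; the password lives elsewhere.
```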
Beyond that, why is the account that’s being invoked Administrator? That’s the all-powerful, privileged account on Windows systems. The principle of least privilege says that applications should run with as few privileges as possible. Is it really necessary to gain all privileges here? Why?
Finally, why is the detailed error message being displayed to users? There’s nothing I can do with the information. Certainly, write it to the log file. Probably, tell the user there’s a system error. But detailed diagnostics are useful to end users only if those users are official system testers. That display should have been disabled on a production system.
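The pattern I’m describing is simple; here’s a sketch in Python (the failing site was .NET, so take this as the shape of the fix, not the fix itself — render_page and error_page are hypothetical stand-ins):

```python
import logging

logging.basicConfig(filename="server.log", level=logging.ERROR)

def handle_request(request):
    try:
        return render_page(request)     # hypothetical application logic
    except Exception:
        # Full details -- stack trace, configuration state -- go to the
        # log, where administrators (not end users) can read them.
        logging.exception("request failed: %s", request.path)
        # The user sees only a generic notice.
        return error_page(500, "A system error occurred; please try later.")
```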
This is Disgusting
This is disgusting. There is no excuse for it.
It is worth noting that Teachers College is access-controlled: you have to show a university ID card to enter. While that certainly isn’t foolproof security, it does suggest that the perpetrator was likely a member of the university community.
The Proper Benefit of an iPhone Design Mistake
There are a number of dubious design decisions in the Apple iPhone and iPod touch. As I wrote earlier, the most serious of these is the apparent intention to make these devices purchasing appliances rather than networked computers. I’m tempted, in fact, to label them iProfits.
From a security perspective, though, there’s another problem: everything runs as root. That is, every application runs with full privileges; if any application has a security hole — and there have been many of them — the attacker has complete control over the device. It is, frankly, rather unbelievable that Apple made such a mistake. Microsoft effectively did this with every version of Windows up to Vista, but at least they had the excuse of backwards compatibility. It almost justifies Apple’s claim that excluding other applications is necessary for security, save that the Palm Pilot has always behaved that way.
There is a silver lining, though. Running as root has one major advantage: root can switch to other userids. This would permit each application to run as a separate userid, thus separating each one from the others. It’s a solution I’ve been advocating for years. Microsoft has done something similar with Internet Explorer 7. Will Apple follow suit? It would be a good way to benefit from a serious misfeature. Of course, they have to separate their own applications that way, too; they’ve certainly had their share of security problems.
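A rough sketch of the scheme, using POSIX calls from Python; the per-application account names are hypothetical. A root-owned launcher forks and permanently drops to a dedicated userid before starting each application:

```python
import os, pwd

def launch(username, app_main):
    """Run one application under its own unprivileged userid.
    The launcher itself must run as root; 'username' (e.g. 'app-mail')
    is a hypothetical per-application account."""
    ent = pwd.getpwnam(username)
    pid = os.fork()
    if pid == 0:
        os.setgroups([])          # drop supplementary groups
        os.setgid(ent.pw_gid)
        os.setuid(ent.pw_uid)     # irrevocably drop root for this child
        app_main()
        os._exit(0)
    return pid

# Each application gets its own uid, so a hole in one of them
# doesn't expose the others' files.
```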
Update: Apple has just announced that in February, they will offer a software development kit (SDK) for the iPhone and iPod touch. This is very good news. However, the note speaks approvingly of Nokia requiring applications to be digitally signed by "known developers". This conflates authentication — who wrote or published the code — and protection. They’re not the same. At best, authentication tells you whom to sue after the fact. What’s really needed is a strong security architecture that prevents nasty things from happening.
It will be interesting to see what happens if Apple does decide to use digital signatures. What will the criteria be for obtaining a certificate? Will certificates need to be renewed frequently, effectively forcing users into a software rental model? Is this the iPhone or the iProfit? (There are some good observations in the New York Times Bits blog.) I’ll post more when we know some details.
Comcast Apparently Blocking Some Peer-to-Peer Traffic
The Associated Press reports that Comcast appears to be blocking some peer-to-peer traffic. Specifically, they appear to be forging reset packets on upload traffic for protocols such as BitTorrent. This is a very dangerous trend and needs to be stopped.
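To see why forged resets work so well, here’s a minimal illustration — not Comcast’s actual mechanism, obviously, just the technique — using the scapy packet library. The connection parameters are made up; the point is that anyone on the path who can observe a connection’s addresses, ports, and sequence numbers can tear it down:

```python
from scapy.all import IP, TCP, send

# Hypothetical connection parameters observed by a device on the path.
client, server = "192.0.2.10", "198.51.100.20"
cport, sport   = 51234, 6881          # 6881: a common BitTorrent port
seqno          = 0x1A2B3C4D           # must fall within the receiver's window

# A RST that appears to come from the server; the client's TCP stack
# aborts the connection, and the client cannot tell it was forged.
rst = IP(src=server, dst=client) / TCP(sport=sport, dport=cport,
                                       flags="R", seq=seqno)
send(rst)                             # requires raw-socket (root) privileges
```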
The central question is who determines what runs on the Internet, end system owners or ISPs. Traditionally, the Internet has fostered the "smart host, dumb network" model, and it has succeeded brilliantly. Rather than innovation being controlled by a small number of providers — and for consumers, at least, the economics favor local monopolies or duopolies — the smart host model draws on many small entrepreneurs and technologists from around the world.
Is copyright the issue? BitTorrent is partnering with content owners, including Fox, Paramount, Warner Bros. and MGM. Besides, Comcast is not a law enforcement agency. This isn’t a simple semantic complaint; when one is dealing with the legal process, there are guarantees of due process and an opportunity to contest the charges.
They may be concerned about bandwidth consumption. This is a legitimate concern, especially since the technology of cable ISPs makes upstream bandwidth more expensive. In that case, though, the remedies are first, to tell customers — per the AP story, Comcast is not saying precisely what its policy is — and second, to use traffic-shaping rather than simply sending resets. Traffic shaping addresses the real problem (overconsumption of expensive upstream bandwidth) without choking innovation.
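For contrast, a traffic shaper delays packets when a sender exceeds its allotment rather than destroying connections. A toy token-bucket shaper — the classic mechanism, with made-up rates — might look like this:

```python
import time

class TokenBucket:
    """Toy token-bucket shaper: 'rate' bytes/second, bursts up to 'burst'.
    Assumes each packet is no larger than 'burst'."""
    def __init__(self, rate, burst):
        self.rate, self.burst = rate, burst
        self.tokens, self.last = burst, time.monotonic()

    def send(self, nbytes):
        while True:
            now = time.monotonic()
            self.tokens = min(self.burst,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return                # packet may go out now
            # Not enough tokens: delay the traffic, don't destroy it.
            time.sleep((nbytes - self.tokens) / self.rate)

# e.g., cap a subscriber's upstream at 100 KB/s with 20 KB bursts:
upstream = TokenBucket(rate=100_000, burst=20_000)
```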
More on Comcast Blocking Peer-to-Peer Traffic
Comcast has finally made some statements on what they’re doing to peer-to-peer traffic. Briefly, they likened it to a telephone busy signal: when the network is too busy — though they won’t say what their criteria are — some connections are interrupted. It’s supposedly almost transparent, since "the software automatically tries again".
That won’t fly. Stating that the software will retry assumes a certain model of software. Perhaps some particular clients will retry. Others may not. The semantics of a TCP Reset are quite well-defined; there’s even an Internet Best Current Practice that warns against other inappropriate TCP Resets.
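The point is easy to demonstrate: a reset surfaces to the application as an error, and whether anything "tries again" is entirely a decision of that application. A sketch:

```python
import socket

def fetch(host, port, request, retries=0):
    """Whether a reset connection is retried is a policy of *this*
    client, not of TCP; a different client may simply report failure."""
    try:
        with socket.create_connection((host, port)) as s:
            s.sendall(request)
            return s.recv(65536)
    except ConnectionResetError:
        if retries > 0:
            return fetch(host, port, request, retries - 1)
        raise       # many programs just fail here
```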
In my earlier post, I noted that Comcast did have one legitimate concern: upstream bandwidth is expensive. According to some reports, though, that isn’t what Comcast is trying to conserve. TorrentFreak notes that the technology "breaks every (seed) connection with new peers after a few seconds if its not a Comcast user." In other words, you’re allowed to consume the expensive upstream bandwidth; however, you’re not allowed to use too much of Comcast’s connections to its peers. Eric Rescorla notes that this is "a pretty attractive way to reduce network traffic without overly annoying too many of your users." It’s also reminiscent of Comcast’s 2003 attempt to redirect web traffic through a "transparent" proxy. They quickly gave up that behavior, though the major concern expressed at the time was customer privacy.
It’s worth noting that others have used TCP Resets to block traffic they don’t want. The most notorious offender is China, which uses Resets to implement its Great Firewall. Is this the model that Comcast wishes to emulate?
"Do Not Track": All or Nothing?
According to press reports, some Internet marketing companies are starting a "Do Not Track" list, a way to opt out of tracking cookies. It’s a good idea, but there’s a downside: using this scheme can hurt your privacy unless you’re very careful.
Internet marketers typically track people via cookies. A cookie is a small amount of text stored on your computer by a web server; it lets the server know you’re the same person (more precisely, you’re using the same web browser) who visited the site some other time. Some cookies are used to track your preferences, such as what sort of web sites you visit or articles you read; this is used to tailor ads to your (perceived) interests.
One of the best explanations of cookies and advertising can be found at Doubleclick’s FAQ. (Doubleclick’s privacy policy disclosure is one of the best out there. This is quite ironic, since years ago they were roundly criticized for their privacy practices. On the other hand, very few people know to check that site, since most people don’t even know it exists.)
You can see how this works by connecting to my cookie test server, which I’ll leave running for a few weeks.
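A stripped-down version of such a server — not my actual one, and the port number is arbitrary — fits in a few lines of Python:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import uuid

class CookieDemo(BaseHTTPRequestHandler):
    """Minimal sketch of a cookie test server."""
    def do_GET(self):
        cookie = self.headers.get("Cookie")
        self.send_response(200)
        if cookie is None:
            # First visit: hand the browser an identifier that it will
            # send back, unprompted, on every later request.
            self.send_header("Set-Cookie", "id=%s" % uuid.uuid4().hex)
            body = b"First visit -- cookie set.\n"
        else:
            body = b"Welcome back: %s\n" % cookie.encode()
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8080), CookieDemo).serve_forever()
```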
A typical "Do Not Track" option works by letting people download a special cookie. Doubleclick’s opt-out service does just that.

Presumably, the AOL version would be more complex, because it will let you specify your interests. That is, it’s intended to permit targeted advertising, but without tracking.
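Mechanically, an opt-out cookie is just an ordinary cookie with a fixed, non-identifying value that the ad server checks before doing anything else. Something like this sketch, where the helper functions are hypothetical and "id=OPT_OUT" mirrors the style of Doubleclick’s opt-out cookie:

```python
def handle_ad_request(cookies):
    """Serve an ad, tracking only if no opt-out cookie is present."""
    if cookies.get("id") == "OPT_OUT":
        return serve_untargeted_ad()    # no profile lookup, no new cookie
    profile = lookup_or_create_profile(cookies.get("id"))
    return serve_targeted_ad(profile)
```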
The problem is that today’s best way to avoid tracking — regularly cleaning out your cookie collection — will delete the "no-track" cookies. (Doubleclick even warns about this.) Users will thus be faced with a choice: defend against everyone, by frequently discarding all cookies; defend against the more responsible marketers, by using their no-track cookies; or try to remember to be selective about discards and/or recreate many different no-track cookies very frequently. None of these options sounds appealing.
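The "selective discard" option is at least scriptable where a program can reach the cookies. With Python’s standard http.cookiejar, for instance, one could clear everything except a list of known opt-out cookies — the list here is hypothetical — though doing the same inside every browser’s own store is exactly the part users can’t be expected to manage:

```python
from http.cookiejar import CookieJar

# Hypothetical (domain, name) pairs of opt-out cookies to preserve.
KEEP = {("doubleclick.net", "id")}

def clear_except_optouts(jar: CookieJar):
    doomed = [c for c in jar
              if (c.domain.lstrip("."), c.name) not in KEEP]
    for c in doomed:
        jar.clear(c.domain, c.path, c.name)
```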
Update: a New York Times blog has noted the same problem. It refers to some technology developed by Tacoda to permit preferences to persist even if cookies are deleted. It isn’t clear to me what that technology is; Tacoda and its subsidiary, Advertising.com, have web pages on cookie-based opt-out. Perhaps it uses a Flash cookie? Flash cookies are just about as useful for tracking people, and they’re seldom deleted because most people don’t know about them.