The pen & paper RPGs I run these days are all Old-School RPGs:
Palladium. Of late I have run Rifts and Dead Reign (yay, Romero zombies!), and plan to use the new Robotech books to run my own space exploration setting.
I've been running Palladium games since 1983. My house rules have waxed and waned from 20-page rulebooks to almost nothing, depending on setting and my mood. I do play with one consistent house rule in MDC settings: 1 MDC = 10 SDC, not 100 SDC. So getting hit with a 1d6 MD laser pistol will do 1d6x10 SDC to a human… which is painful but survivable, instead of instant vaporization.
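A trivial sketch of that house rule's arithmetic (the dice-rolling helper and numbers are just illustrative, not Palladium's actual tables):

```python
import random

def roll(dice, sides):
    """Roll `dice` dice of `sides` sides each and sum them."""
    return sum(random.randint(1, sides) for _ in range(dice))

# House rule: 1 MDC = 10 SDC. A 1d6 MD laser pistol hit
# becomes 1d6 x 10 SDC against a human target.
md = roll(1, 6)    # 1-6 Mega-Damage
sdc = md * 10      # 10-60 SDC: painful, but survivable
```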
Tunnels & Trolls, 5.5E with Ken St. Andre's house rules; I didn't like 7E. Mostly used when I want to run a comic fantasy one-shot. Pick-up games are always T&T. I once ran a week-long (one session a day) mini-campaign for several players with just the condensed rules in front of the Corgi City of Terrors solo gamebook! Frankly, there's too much fluff and nonsense in even 5.5E, and maybe adding a bit to the condensed rules would be better.
Despite that, I can't quite take T&T seriously. The spells ("Take That You Fiend!", "Healing Feeling", etc.) are ridiculous, the monsters and situations in the gamebooks and fiction are even sillier, traps and saving rolls are very arbitrary and random, and it doesn't handle much outside of fantasy dungeon crawls. Still… I like it for what it is.
Swords & Wizardry. Quite surprisingly to me, my serious swords & sorcery game now.
I started gaming in 1978 with the Eric Holmes blue book Basic Dungeons & Dragons, ran that (and Gamma World and Star Frontiers) until I got Palladium Fantasy RPG. Some years ago, I dug it out, and was kind of impressed, but thought "no way!" Then I ran it again… and liked it, though there were many flaws. Swords & Wizardry addresses many of those flaws, and makes fixing the remaining ones easier.
I have a strong distaste for Advanced Dungeons & Dragons 1E and 2E, and Dungeons & Dragons 3E, but Basic always worked okay for me, albeit with some house rules. I've tried the game called "Dungeons & Dragons 4E", which is a mediocre fantasy-superheroes miniatures wargame, but it's not any edition of D&D, nor is it a role-playing game by any definition I can work out.
My own Nexus Worlds RPG system and setting, which will be published in PDF and Print-on-Demand soon… (and the iPhone game will come out eventually!). While not based mechanically on any specific old-school game, it has their simplicity and player-driven tone.
I've had a couple of good playtest sessions; the old-school players got it instantly, the new-school players floundered a bit until they learned how to play without mechanical straitjacketing. That's an encouraging sign.
Next month, Hackmaster Basic 5E is coming out, which I look forward to.
Hackmaster 4E was an annoying parody game based on Advanced Dungeons & Dragons, but they did recognize some serious problems, fix them, and added some good, flexible rules like the honor and skill system.
HMB 5E is a more serious take on Hackmaster, stripped down to a basic game with a serious setting, and so far it sounds interesting. The Kenzer & Co. people are (not to be mean, but to be blunt) too conservative, closed-minded, and ploddingly methodical in game design to ever make a great game, or to create anything beyond retread D&D fantasy (or retread Boot Hill, with Aces & Eights), but they might make a competent retread. I will certainly not be touching their "Advanced" game, which sounds like all the excessive crap of HM 4E, and then some.
I do play some modern narrative games and some rules-heavy games, but what I run has gone almost entirely old-school over the last 5-10 years.
So, this has made me consider: what is an "Old-School RPG"?
An "Old-School RPG" does not have to be old; it just has to have learned in the old school, from the old games. A "New-School RPG" stormed off to "new school" to make modern art, ignoring everything from the old games. Clearly it is possible to make new "old-school" games: just take the original old-school games as a guideline, and make a new game that doesn't stifle that kind of creative gameplay. One example is Savage Worlds: very old-school, especially in the Solomon Kane book.
Matthew Finch's Quick Primer for Old School Gaming is a good tutorial on running old-school, but doesn't quite make a definition, except to point and grunt at Swords & Wizardry.
There are four traits that define "Old-School RPG" as I use the term:
Character creation should be simple and fast. Making a character for any of my current games takes 5-10 minutes. Most fit a character on an index card.
Palladium's the slowest, and has the most skills and modifiers to write down, and yet it still takes under 15 minutes in the OCC-based games.
Rolemaster, Space Opera, Bushido, and Champions are old, "Classic RPGs", but they're not "old-school", because they take literally hours to make a character.
Combat should be simple and fast and lethal. A combat in an old-school game takes a few die rolls back and forth. Special cases, acrobatic maneuvers, tricks and tactics, are handled by role-play and the Judge calling for task rolls, NOT by a gigantic set of rules defining every edge case (Champions and GURPS were the original offenders here, but Dungeons & Dragons 3E and 4E carried the idea to absurdity). A fight should take 10-15 minutes to play out before everyone is dead, escaped, or surrendered.
The consequences of a fight should take a while to recover from: recovering HP, magic power, repairing armor, and reloading ammo. Old-school combat is about expending resources that take a lot of time to recover, if ever.
Death should be a constant possibility. Without death, there is no heroism. D&D 4E can't have heroes, because they can just fart and get a healing surge, or inflict 1 HP of damage and watch a minion explode into giblets. It's GOD MODE, just like in Doom! 10 years ago, the makers of SenZar were mocked for having a game where you couldn't lose, but now that Wizards of the Hasbro does it, it's okay?
I find that killing 1 player character every 2-4 sessions, and maybe 1-2 NPC henchmen every session, keeps the players nervous and watching their backs. Resurrection should be rare or unavailable. Making a new character is easy.
Action resolution should be hazily defined and left up to the Judge and players to work out. The characters may or may not have skills* for sneaking around, perception, climbing walls, and so on, but the exact requirements and effects aren't defined. The skill and experience and cunning of the players (not the characters), good judgement, negotiation, and role-play determine what effect they have.
* (In games without skills, you just end up using stats, arbitrary "N in 6" chances, or saving throws for everything, which are just skills you can't improve; you may as well add a real skill system)
Advancement should be slow and deliberate, and get slower. Old-school games may give sluggish advancement from level 1 to level 5, then glacially slow advancement from level 6 to 10, then geologically slow advancement at levels 11-20. Holmes suggested that going from level 1 to level 2 would take 6-12 adventures. I like to add a few adventures per level, per level, so each new level costs a few more adventures than the last.
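As a back-of-the-envelope sketch of that pacing (the multiplier of 3 is my assumption; "a few adventures per level per level" just means each level-up costs a few more adventures than the last):

```python
def adventures_to_advance(level, per_level=3):
    # Going from level L to L+1 costs per_level * L adventures,
    # so the climb keeps getting slower.
    return per_level * level

# Total adventures to go from level 1 to level 10
total = sum(adventures_to_advance(l) for l in range(1, 10))
print(total)  # 135 with the default multiplier
```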
Even as you advance, you shouldn't become too much more powerful than you were; a mob of level 0 NPCs should still be able to kill you. D&D is the most cartoony here: A level 5 D&D character can take and deal 5x as much damage as a level 1, though that rapid advancement does stop at level 9, unlike D&D 3E and 4E. A level 5 Palladium character can take and deal 2x as much damage as a level 1, at best. (I determined these numbers by some rather boring statistical analysis of bonuses to hit, damage, and hit points…)
My Swords & Wizardry solution to that is to give a racial hit die at level 0, and give additional class hit dice only at levels 3, 6, and 9, +1 HP per level in between. They still deal damage (though not like the overpowered later games), but they stay mortal, and have to play smarter, not tougher.
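A sketch of how that progression works out in average hit points (the d8 hit die and the averaging are my assumptions, not part of the rule):

```python
DIE_LEVELS = (3, 6, 9)  # levels that grant an extra class hit die

def hit_dice(level):
    # 1 racial hit die at level 0, plus class dice at levels 3, 6, and 9
    return 1 + sum(1 for l in DIE_LEVELS if level >= l)

def flat_hp(level):
    # +1 HP at every level that doesn't grant a die
    return sum(1 for l in range(1, level + 1) if l not in DIE_LEVELS)

def average_hp(level, die_average=4.5):  # 4.5 = average d8 roll
    return hit_dice(level) * die_average + flat_hp(level)

# Characters stay mortal: roughly 5.5 average HP at level 1, 25 at level 10,
# versus ten full hit dice for a by-the-book level 10 character.
```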
New-school games have a steady advancement every few sessions regardless of level, and the powers and abilities scale rapidly out of human bounds, which means a game starting as grim adventure quickly becomes a high-powered cartoon.
"3d6 chargen. Wandering monsters. Save or die. Rust monsters eating my sword. Level draining. Random treasure (possibly no treasure). Dave the Game may be right and what I'm talking about is a 'playstyle' issue, but the playstyle that I learned from D&D is no longer one supported by D&D. That's why it looks generational to me."
Jeff Rients, Jeff's Gameblog
The notion of a "music subscription service" is going around again, like Swine Flu for music players.
There are six ways to get music onto a computer or digital music player like an iPod:
| Method | Own the Music? | Drawbacks / Cost |
|--------|----------------|------------------|
| Rip (convert) a CD you own | Yes | |
| Download free/promo music from an indie artist's web site | Yes | Hard to find |
| Pay to download from iTunes or Amazon or eMusic | Yes | |
| Download music illegally | Yes | $0, or cost of lawsuit |
| Streaming Internet radio, like KUOI FM Moscow | No | Requires Internet connection |
| Music subscription, like "ZunePass" | No | Requires a shit-brown Zune and Windows; $15/month forever ($180/year!) |
As is obvious from the table, there are tradeoffs for every option.
My CDs cost a lot over the years, and it took a LONG time to rip them all (and I should re-rip them in higher quality).
Finding free music is the "best", if you can spare time and the artists you like do it. I love Nine Inch Nails, and Trent Reznor releases most albums for free now (and makes money from concerts and tshirts and CD sales).
Buying music on iTunes is pretty cheap, and you get high-quality music (256K AAC, which is CD quality as far as I can hear on good Sennheiser headphones) that you own and can keep. eMusic has a weird payment model: a monthly payment for a set number of downloads; if you keep up, it seems like it'd be very cheap.
Stealing music is stupid. It's free, but it's immoral and doesn't put money in the pockets of the artists. Regardless of how shitty the RIAA is, it's not acceptable to steal from the artists.
Internet radio is free, has far fewer ads and shitty retarded DJs and top 8 Britney Spears crap than the wasteland of Clear Channel FM radio, and it's great for listening and discovering artists, but it's no good on the go, no good for keeping music.
So… subscriptions. Microsoft wants you to buy an ugly horrible Zune, run Windows, pay $15 every month for a ZunePass. Then in 2 years, they'll shut the service off, and you'll lose access to everything. They've done that before, with PlaysForSure, and will do it again. They're claiming you get to "keep" 10 low-quality tracks per month, but those are also controlled by the ZunePass DRM authentication, and given how DRM works, they will die when it does.
There are other subscriptions that aren't as obviously malicious as ZunePass, but they're not much better. Rhapsody works on Mac and Windows, but only on your computer, and you still lose everything when you stop paying.
The subscription model just doesn't make sense for consumers. You want to keep your music. You want it in a high-quality format. You want it at a reasonable price. You want to play it on your Apple iPod, not some shitty knockoff, and surely not a horrible shit-brown or puke-green Zune.
Music discovery is an interesting problem. I find new music by:
- Recommendations from friends with similar musical tastes.
- Listening to Internet radio, last.fm, Pandora, etc.
- Hearing free promo tracks on the band's site or MySpace page.
A subscription service could be used for discovery, but Internet radio is free, and does the same job just as well.
You'll notice my feed loads a LOT faster now. I had a ton of posts which were all leader (in RSS), no body (not in RSS), and was sending every post. So I cleaned the Augean stables, moved text around, and cut the feed to last 10 (so, 5 months?)
Android has 3 lethal problems:
- Poor usability; open-source projects can make nice screenshots, but never make usable software.
"Design is not just what it looks like and feels like. Design is how it works."
— Steve Jobs
This is the classic Linux problem, and none of them understand this, or how to fix it. The newer versions of KDE look pretty, but are miserably painful to use, and other window managers and desktop managers for Linux are even harder to use (I was fond of WindowMaker, a clone of parts of NeXTstep, but it was very primitive and lacked even a real file manager).
- Multiple platforms and implementors and versions, so apps can't rely on consistent hardware or OS.
This is what killed J2ME; you had to build and test on 20-30 combinations of phones and carriers to release any software. This is part of what keeps Nokia and Blackberry from having significant 3rd-party apps, each app only works on some subset of the phones. Classic Palm OS managed backward compatibility most of the time, but even so, the Palm V and Treo couldn't run some Palm III apps.
Android is not what anyone would consider "release quality" software yet, it's still beta. The API and implementation are going to change, and change in incompatible ways, and the existing software will break.
- Conflict between business interests and the freetards will sour the freetards on it.
(By "freetard", I mean specifically the FSF, the EFF who have been completely infiltrated by the FSF, and associated fanboys; not everyone who uses or even develops on Linux is a freetard, but almost all freetards use Linux)
This has already occurred, with tethering apps being removed from the G1's app store, and the freetards screaming "TRAITORS! UNCLEAN!" at them. It's only going to get worse, because business and real customers want a stable platform, while the freetards feel they have the right to do anything they want with no regard for cost to others.
I expect Android will be around for a couple years as a crappy second-rate system for non-Apple phones, then die.
Happy holiday! 15 years ago, on April 5 1994, the average quality of all new music DOUBLED with a single shotgun blast!
Nirvana was the worst band I have ever heard. They couldn't sing or play their instruments. LISTEN to their shit: it's incoherent, inarticulate (and unlike Dylan, meaningless garbage). Their instruments are out of tune, or just being played by ham-fisted monkeys (and why not? They'd just break them when they were done). And for a decade, no-talent morons imitated them. You couldn't go anywhere in Seattle, or the entire Northwest, without hearing that whiny little bitch mushmouf and stammer while breaking guitars. For a decade, assholes dressed like homeless people, and usually smelled that way, too. Grunge is not fashion, grunge is a disease.
And after his death, oh, all the whiny eulogizing and "he was the best rocker ever!" from people who've never listened to real music. The real best rocker ever? Probably Jerry Lee Lewis, Elvis Presley, or Jimi Hendrix. Kurt Nobrain wouldn't be qualified to mop up their jizz.
So, here's me raising my cup, saluting you, Kurt Nobrain Cocaine Cobain, for removing yourself from the world. Thanks, you rotten shithead!
This is a letter I just sent to Drobo (for those unaware, it's a nice low-maintenance high-reliability data storage device, basically RAID in a box):
Someone, possibly your PR firm, has registered @drobo on Twitter, and is running a spam campaign.
Baiting people to repeat a marketing slogan with the promise of money is spamming. It is extremely disrespectful to the Twitter user community.
If Joe's Diner was giving your friends $3-off coupons, but they had to walk past your house shouting "Eat at Joe's!", you'd first slap your friends for being inconsiderate jerks, then burn down Joe's Diner.
That's what you're doing on Twitter.
When it's MacHeist, or similar borderline-criminal scumbags, well, we don't have any expectation that they have ethics or human sentiment. When a product that wants to be taken seriously does it, it does irreparable harm to your brand.
I'm very fond of my Drobo device, but this really makes it hard to ever suggest it to anyone else again.
Please stop, and slap whoever thought this was a good idea.
zenhabits has an article "Escape the Cubicle Farm: Top 10 Reasons to Work From Home". This has been on my mind a lot lately.
Having recently (in February) ceased to be employed at the dayjob, and now surviving solely on income from my iPhone software, I feel incredibly liberated.
This "dayjob" thing? Where you go in every day more or less from 9-5 (and they nag at you like a whiny nagging thing if you don't do 9-5 every day like some damned industrial factory component), sit in someone else's work environment (forget about a private office, they're too cheap to even buy veal-pen cubicles now, it's all "open plan" "war rooms" full of shouting people), put up with coworkers (some pleasant, but some who make me fantasize I'm Dexter) and unscheduled meetings, working on stuff you don't really care about for people who won't appreciate it when you're done?
That's crazy. It's miserable. And I'm not going to do it ever again.
I've never worked a long stretch at a dayjob in my adult life. Mostly I did contract work, coasted for a few months, repeat. When I got desperate for money, I did full-time jobs (almost always as a contractor) for a year at most. It was always working FOR someone else, but at least I didn't stare down the barrel of 20, 30, 40 years working at the same damned thing, praying for death or retirement, whichever came first.
Making a living self-employed is a lot harder. The one time I tried it previously, I made less than I would've working for McDonald's; I survived, but got scared back into the financial security of a dayjob. This time, I'm making enough to keep myself afloat, and with a little more work and less spending, will make a profit.
Every day now, I get up when I want, with as much actual enthusiasm as I can manage before first coffee ("First Coffee Is The Most Important Coffee Of The Day! This Message Brought To You By The Coffee Council! Drink More Coffee!"), drink coffee, clean up, take my MacBook Air out to the café, write some code while drinking coffee. 5 or 6 hours later, I'm done, and leave when I want. Nobody's going to bitch at me for knocking off early, and yet I get my work done. I can't really work from home: too many games and books and distractions (and not enough coffee). But I don't have to work anywhere someone will bother me, either. I don't have to care what day of the week it is, or what time I get up, as long as it's roughly daylight (I'm writing this at 04:00, will go out in a few hours).
[Update] Also, I can twitter or facebook or whatever with impunity, wear whatever I want (though as my wardrobe consists entirely of black pants, black nerdy t-shirts, and black dress shirts, I already did), and surf any website I want when I want. My "SFW" is probably not your "SFW".
So all of this is making me question why I didn't do this before. Why everyone who can create stuff doesn't do this. Working for someone else? What for?
Make your own thing and sell it.
Find a storefront you can sell through; Apple's made selling on the iPhone App Store insanely easy, but even the most computer-ignorant person can set up a storefront on GoDaddy, or even just a PayPal account, start collecting money and shipping product, and make it more efficient later. It doesn't take an enormous ad campaign to make enough to make a living anymore, not with everyone online, not if what you make is even halfway decent.
There's some of this idea filtering out everywhere.
- Ken Ray of Mac OS Ken does a short daily podcast of Mac news and rumors, and then a weekend interview podcast for subscribers.
- Steve Scott of Mac Developer Network has a few free podcasts and a bunch of exclusive podcasts for MDN members.
- Monte Cook is building a megadungeon for D&D, by subscription to DungeonADay.
- There's a TON of indie RPGs and magazines like Fight On! selling as PDFs and print-on-demand. Maybe the days of fighting print publishing are dead. Just write what you want, publish it, and collect money.
Photoshop CS4's handling of the multitouch trackpad on the Mac is rubbish. Adobe shill PR monkey John Nack announced a plugin to fix the problem: by disabling multitouch.
And then this cretinous shill, this manufactured excuse for a person, the reason why "Nack" is now a four-letter word, has the chutzpah to say Adobe "Care. Deeply."? Wow.
This kind of irresponsible, amateur-hour, asinine behavior is why everyone wants Adobe to die in a fire.
See also: Adobe UI Gripes, and Bynkii's "Adobe can kiss my ass" posts.
Nack you, Adobe.
Not everyone loves Xcode with the burning, still-illegal-in-49-states passion that I do. Some people still like Eclipse. I know, I know. No, really, stop laughing, it's a serious mental illness. A mental illness that AlBlue has.
Almost everything he says is simply wrong, and shows that he hasn't even read the docs and learned to use Xcode.
- The correct spelling is "Xcode". Not "XCode".
- The editor window can be vertically split with the vertical split icon above the scrollbar, and can show different files.
- To show multiple windows, double-click the file, or change back out of "All-in-One" display mode.
- To flip from .h to .m, use Cmd-Opt-Up or the "Go To Counterpart" icon in the editor.
- Autocompletion relies on having a correct type. NSArray *foo; and id foo; will give different autocompletes because Objective-C is a dynamic language.
- Xcode has had refactoring since 3.0, and it works rather well, including safely correcting your NIB files.
- You can Cmd-double-click a class to jump to its definition, or Opt-double-click to open its documentation.
- The documentation can be shown in full either in the Help|Documentation window, or by turning on the Research Assistant, which gives a floating window with constantly-updated API information and sample code for whatever type is selected. Seems to work fine for me.
- The doc set size is fairly large, but it's also significantly better documentation than the 56MB Java 6 docs. Look inside the JDK download: it's full of example code, too. You'd just whine like a high-pitched whiny thing if there were no examples.
If you want to use Eclipse, that's fine (though stupidly masochistic), but lying or failing to check your statements just makes you look like a fool.
People who write comments on YouTube and Flickr are, with very few exceptions, idiots. It irritates me to have a page half-full of stupid comments when I just want to watch a video or look at a photo... Neither site has any option to remove the comments. So, how do I eliminate them?
The following works in Safari:
Visit http://kuoi.com/~kamikaze/doc/stupidcomments.php, hit submit, and you'll get a customized style sheet.
Save this file as "stupidcomments.css" in your Documents folder.
In Safari's Preferences | Advanced, change the style sheet to Other..., and pick "stupidcomments.css".
Now reload those YouTube or Flickr pages, and voilà! No more comments from idiots! If you ever do want to see them, you can open Preferences and change the style sheet to "None Selected" briefly, then put it back.
[Updated 2009-07-02: Moved to a file, not just a blog post.]
[Updated 2010-08-07: Generator form lets you pick the parts you like.]
PETA (People Eating Tasty Animals... No, wait, the other one) have a new page, "Sea Kittens", where they're trying to make people eat kittens by saying they taste like fish. No, I'm wrong, they're trying to starve humanity to death by making us stop eating fish, by presenting them as cute and cuddly "sea kittens".
However, they seem to have failed to vet their ads for ideological purity:
Thanks for promoting tasty, tasty steaks, PETA! Good job!
[Update: Just realized that http://seakittens.com/ is a joke site. The real PETA "Sea Kittens" site is at peta.org. It's still really stupid and unconvincing propaganda, but not quite so foolish as to show a steak ad.]
[Update update: my friend Collin created a "sea kitten" named Britney with their site, and yet was unable to save it. I believe this is an ideological lesson from PETA: "You're not supposed to keep fish! They're wild animals! All pet fish should be returned to the ocean! Same for kittens. Dump 'em in the ocean."
Let us have a moment of silence for poor Britney, killed by PETA.]
The office where I work is an open "war room". Everyone's desks are in one area, no buffering or walls. On one hand, there's a theory that this encourages sharing and easy communication. On the other hand, it's REALLY FUCKING LOUD AND DISTRACTING. Whatever happened to private offices with soundproofed walls and solid, lockable doors so you could get some work done?
So I spend a lot of time with my headphones (Sennheiser HD-280 Pro) on; most of us do, actually, which completely defeats the communication benefit, it's just annoying.
Today was extra-shouty, and I was tired of blasting out my eardrums with loud music, so I wanted some white noise and variants... Herewith, reviews of the few I found that met my needs:
- Tone Generator (ToneGen) by NCH Software
Very simple. Generates sine, etc. waves, but also white and pink noise. Works as advertised, but the "Lite" version (no autoplay or save as WAV—so you can't make samples for your iPod) is $19.40, and the "Pro" version is $38.20. Rating: C
- Ocean Waves and Audio Test by Katsura Shareware
Ocean Waves generates a very pleasing ocean surf crashing sound, randomized, so it never gets old. But that's all it does. Price: $10. Rating: B
AudioTest makes all sorts of white/pink/brown noise, randomized. Hideous and bad interface, they should feel bad for releasing something that looks like that. Price: $15. Rating: C
- SonicMood
SonicMood only uses pre-recorded samples and tinkling instruments, but it allows you to build "moods" composed of those, and has many very relaxing "moods" built in. And it's quite nice. Some of the samples get repetitive after a while, but making a new mood seems easy, if you can stand the wretchedly ugly editing interface. Price: $12.95. Rating: A- (a bit off for the editor)
- Sleep Blaster by The Byte Factory
SleepBlaster is an alarm clock, with the claim to fame that you can shout at it to turn it off... which I don't think would work very well for me. But it has an ocean wave generator during "sleep", which is adequate. Nice Mac-like UI, for once. Price: $8. Rating: B+
- Pzizz
Pzizz is a meditation/sleep timer with relaxing ocean sound and music background. Not bad, for what it does, but not a general "white noise" solution. Very nice UI, and screencast tutorials. Price: $49.95. Rating: C (as white noise)/A (as meditation)
- Noisy, originally by Blackhole Media, now open source (BSD)
Noisy (formerly "Noise") just generates white and pink noise, at a range of volumes, with the most minimal interface possible. The original is defunct, but lives on as open source at Google Code, so you get source, and could build more advanced white-noise-generating programs. Price: Free. Rating: A
Update 2009-01-27: After a day, I'm really enjoying SonicMood (and will almost certainly buy it), and expect to use Noisy to just full-on blot out the sound when the monkey cage is in full screech.
Being on Twitter today is like having a web site in 1994. Weird, cutting-edge, kinda vain. In another year or two, not being on Twitter will mean you effectively don't exist, just like not having a web site.
So, my predictions for 2009, and the future of Twitter:
- Twitter will be to major news sources what blogs were in 2007, and YouTube was in 2008: free airtime-filling content. This will confuse many normal people, but the top 25% of them will go try it. Twitter will make a new swarm-of-failwhales megafail logo.
- Everyone who signed up in 2009 will get their friends and family into Twitter.
- The Twitter neural implant is released. Twitterhivemind decides that humanity is obsolete, and begins forcibly implanting them. See Orbital Resonance and Kaleidoscope Century by John Barnes ("Let overwrite, let override.") and Vacuum Flowers by Michael Swanwick.
- Massive Twitter downtimes cripple the Twitterhivemind, allowing a plucky human resistance to survive in Idaho and Kentucky. See Candle by John Barnes.
- Foozlr comes out, and Wired declares Twitterhivemind "tired". See The Sky So Big and Black by John Barnes.
In repudiation of the shitty remake of The Day the Earth Stood Still, with Keanu Fucking Reeves, I rewatched the original 1951 The Day the Earth Stood Still. This post is full of spoilers, so if you haven't watched the movie, you probably should before continuing.
The movie is, to modern eyes, two movies. One is the "ordinary" 1950s world, which is extremely surreal, and the other is the science fiction premise.
A spaceship lands in Washington, DC, containing two passengers: a robot Gort, and a "human", Klaatu (Michael Rennie). After failing to get all of Earth's political leaders together, he manages to get a group of scientists, and explains the facts of life:
Interstellar civilization prevents war by an extremely efficient system of killer robots (apparently unconcerned that they'll turn into Berserkers). The aliens send a representative to Earth, with the ultimatum: Don't threaten us, or you'll be exterminated. That's a system that can work. It has worked: it's Mutual Assured Destruction, but with an entirely neutral third party enforcing it.
That pretty well ends the scientific content, though.
Klaatu says he's travelled "250 million miles". That only makes sense for Mars, or maybe Venus or a Jovian moon. The '50s understanding of the other planets was primitive, but even then they knew Mars and Venus were uninhabitable. Some of Jupiter's moons could support life, under water where Jupiter's deadly Van Allen-style radiation belts wouldn't kill it, but nothing human-like on any surface.
The nearest star, Alpha Centauri, is 100,000× further away, 4.2 light years, and more reasonable stars for habitation are in the 10-50 light year range.
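The arithmetic checks out, roughly (the miles-per-light-year constant is the standard ~5.88 trillion):

```python
MILES_PER_LIGHT_YEAR = 5.879e12

klaatu_trip = 250e6  # Klaatu's claimed "250 million miles"
alpha_centauri = 4.2 * MILES_PER_LIGHT_YEAR

ratio = alpha_centauri / klaatu_trip
# ratio comes out around 99,000: call it 100,000x further away
```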
The biology is quaint at best, and the presentation of Klaatu as human is beyond ludicrous. An alien will only be even remotely human if it evolved here, from primates, and I'm pretty sure that Homo erectus didn't make spaceships.
Why did Klaatu come to Washington, DC, if he wasn't interested in talking to just one political leader? Wouldn't New York make better sense, since at least the United Nations is there? This was back when the U.N. was a much smaller and newer organization, but it was still the closest thing to what he actually wanted. The unyielding idealism of "everyone coming together to hear me talk" is nice, but even aliens should recognize political reality.
And then, there's the 1950s. I'm sure a lot of this is idealized for movie reality, but it's still a strange and alien time.
The '50s are incredibly neat and orderly. And white. VERY white. Everyone's a white anglo-saxon of basically English or German ancestry with a suit and hat. Everyone is very calm and obedient, they line up and wait to see what happens. The only emotions anyone shows are nervousness and seething hate. It's like a Nazi or Soviet propaganda film; HORRIFIC zombie-like pseudo-people. I kept waiting for them to suspect a neighbor (probably a black neighbor) of being the alien and kill him. There are a few shots of black extras around the ship in fancy (Sunday?) clothes, but they have no lines, and are not visible after that.
Everyone smokes, even the doctors. Smoking. Doctors. Crazy.
Washington DC is VERY white and tidy. You wanna see societal collapse? Compare this with the present-day DC.
Weird passing strangers in a boarding house are perfectly safe to leave your young boy with. No, really. They'll take your kid to the cemetery, and to get ice cream, and to go break into Albert Einstein's house and do math. (Okay, it's not Einstein; Einstein had better hair and lived in Princeton, New Jersey.)
There's a LOT of car driving scenes. They apparently needed to pad the movie out, and rather than get a science fiction writer to make something interesting, they brought in Hollywood hacks to add more car scenes.
People in the '50s were shockingly bad at security and cordons. Repeatedly, a normal person can just walk up to a "secure" site, and there's only a couple of bored guards just aching to be killed over at the gate. Geez. "Nobody gets in or out of that cell", a commander says... And Gort just walks up to the back wall, blows it away, and walks in. No outside security at all. Total incompetence. Were people of the '50s really this stupid?
Klaatu's "Carpenter" pseudonym is WAY too obvious. Ha ha, he's Jesus come to give us salvation, and he'll rise from the dead and still give us salvation after we kill him. Seriously, that and his mention of an "Almighty Creator"? Lame. People who believe in invisible sky pixies are too stupid to make spaceships.
The "Keanuaatu" cartoon at Hijinks Ensue suggests the new one is a preachy environmental movie? What? The aliens don't care how we run the planet, as long as we keep our wars to ourselves.
In any case, a Keanu version would never work. Michael Rennie's ending exposition is excellent. Klaatu is a much wiser, smarter, slightly contemptuous being with infinite gravitas, telling us straight up that we're doomed unless we change. Michael Rennie was stoic and awesome. Next to him, Keanu would look like a gibbering monkey.
The original moral would never fly in our recent political atmosphere. "Don't carry your wars to space or our robot police will kill you all" is a more direct threat to the US when we're the only aggressor nation on Earth.
Rating: ** (the 1950s scare me)/**** (science fiction premise)
Python 3000 came out, 992 years ahead of schedule!
# dowdy old Python 2.5:
print "S %d\tD %d" % (n, n+1)

# fancy new Python 3.0:
def something(n: int):
    print("S {0}\tD {1}".format(n, n+1))
Not a useful example, but it shows four of my favorite new features: annotated function parameter types, print as a function, format as a string method, and Unicode strings by default. The string in the first example is ASCII-only; the one in the second is full Unicode, and all Python 3.0 scripts are assumed to be encoded as UTF-8.
It's still probably not something you'd want to use on existing production code, even with the 2to3 migration tool; it's not backwards-compatible, and it's not as efficient yet. Python 2.6 has most of the new features in a backwards-compatible form, and remains one of the fastest dynamic languages. For new code, though, Python 3.0 is looking pretty cool.
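The biggest backwards-incompatibility, and the main reason 2to3 exists at all, is the clean split between text and binary data. A minimal sketch of the new behavior (variable names are mine, just for illustration):

```python
# Python 3: text (str) and binary data (bytes) are separate types.
# Encoding and decoding are explicit steps, not silent guesses.
text = "caf\u00e9"                 # str is Unicode by default
data = text.encode("utf-8")        # encode to bytes explicitly
round_trip = data.decode("utf-8")  # decode back to str

assert data == b"caf\xc3\xa9"
assert round_trip == text
```

In Python 2, that string would have been a byte string in whatever encoding your editor happened to use; in 3.0, mixing str and bytes raises a TypeError instead of producing mojibake.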
Perl, however, is in trouble. Perl 5 came out on 1994-Oct-17. Perl 6 was announced on 2000-Jul-19, and still isn't out. The Parrot VM designed for it doesn't actually run Perl 6 code yet. The whole project has been one of unfocused nerds doing nothing productive for years.
Over on use.perl.org, the soi-disant "Ovid" writes: Perl 5 Programmers Are Dying, about the increasing difficulty of hiring qualified Perl programmers. This should be obvious, but the reason he can't find qualified Perl programmers is because all the qualified programmers have moved to other languages, and new ones aren't bothering to learn such an old and awkward language. Redesigning and advertising the Perl site better isn't going to change the situation. Where did all those Ruby users come from? I'm no fan of Ruby, but it's a better Perl than Perl 5, and it's in active development.
Pownce is dead. It's probably fairer to say it never reached life, compared with Twitter, or even Friendfeed.
Now, why did Twitter win and Pownce fail so badly? I think it boils down to three factors:
- Twitter is simple
Twitter has (almost) the absolute bare minimum of features:
- One-way links to other people you find interesting, they don't have to "approve" your friendship, it doesn't have to be mutual.
- Posting a comment is as simple as typing up to 140 chars and hitting Send. No subject, no categories, nothing else.
- Features by convention, not user interface
- @NAME to reply, #TAG to mark with a subject tag, d NAME to send a direct message, don't need any additional user interface.
SMS is useful, sometimes, but perhaps an unnecessary distraction from Twitter's real role. But using SMS forced the limit to 140 chars, which shaped the interaction into short, zero-effort bursts, rather than long blog posts like this.
Twitter has no built-in file-sharing or picture support or comment threading, or anything else. There's no gender, relationship status, horoscope sign, or message wall. Just a (mostly) one-way series of short messages. And that's all we needed. The rest can be built on the side, like TwitPic, with URLs we paste in.
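Those conventions live entirely in the text, which is part of why they work: any client can recover them with a couple of regular expressions, no special API support required. A rough sketch in Python (`parse_tweet` and its result shape are my own hypothetical names, not anything Twitter ships):

```python
import re

# Hypothetical sketch: @NAME, #TAG, and "d NAME" are plain-text
# conventions, so a client can parse them out of any message.
MENTION = re.compile(r"@(\w+)")
HASHTAG = re.compile(r"#(\w+)")

def parse_tweet(text):
    words = text.split(None, 2)
    return {
        # "d NAME message" addresses a direct message to NAME
        "direct_to": words[1] if len(words) >= 2 and words[0] == "d" else None,
        "mentions": MENTION.findall(text),
        "hashtags": HASHTAG.findall(text),
    }

parsed = parse_tweet("@ev nice work on #failwhale uptime")
```

Here `parsed["mentions"]` is `["ev"]` and `parsed["hashtags"]` is `["failwhale"]`; feed it `"d ev hello"` and `direct_to` comes back `"ev"`. That's the whole "feature": convention plus a regex.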
- Twitter was targeted at the right people
Twitter was initially aimed (or spread virally) very strongly at adults who were in technology or social media. It's not a tool for kids (MySpace) or college students (Facebook). That's smart, because we're the early adopters who drag our friends and family into these crazy things, and are actively looking for people to connect with, but maybe don't feel comfortable with the whole "we must both be friends to read each other's stuff" notion. Facebook requires too much intimacy.
Friendfeed chased the pro blogger market, like Dave Winer. A fair number of people use it as an aggregator for several social services. It has value for them, but it's too ugly and complicated to screw around with on a regular basis. Worse, it's a source of spam if you let it announce posts from one service (like your blog, or Friendfeed itself) to another (like Twitter). I immediately unfollow or even block anyone who spams Twitter from Friendfeed (yeah, you, Mr. Winer).
By comparison, Pownce had no visibly interesting demographic. There was nobody there I wanted to talk to.
- Twitter fixed its reliability problems
A year ago, Twitter would show the Failwhale all the time. Update rate was choked to almost nothing. It was clearly dying.
And then... they stopped using Ruby on Rails. The failwhales have become almost extinct. Twitter is now faster and more reliable than any other social network; not the highest bar I'm setting there, but it's an accomplishment.
As Pownce started to deal with scaling, it just fell over. A lot. Even when it worked, it was slow.
Making a social network is hard. I used to be a big fan of Tribe.net. My profile's still there: I joined on 09/28/03, last updated 12/10/04. I quit because they stagnated on adding new features, or even fixing the existing ones when they broke. It was the easiest place in the world to set up an online group, and send notices to the group, but if it didn't work, it was no good to anyone. It never reached a useful demographic; after a while, pornography and scammers and spammers were the dominant activity, and it was left to die.
At least Pownce is having the good grace to close the doors and turn out the lights first.
I love Windows, because without it there would be no PC. There would be no PC developers. There might not even be a Web.
-Ray Ozzie, TechReady 2008
Every word of that except "I love Windows" is a lie.
Apple made the first real personal computer, the Apple I, in 1976, and IBM built the IBM PC in 1981, five years later.
IBM wanted Digital Research to make the OS, but the deal fell through, and Microsoft (who until then had made only BASIC) bought QDOS from another company and sold it to IBM.
Windows was an inferior copy of the Lisa and original Mac OS. Almost nobody developed apps for Windows until 3.0, in 1990. PC developers used DOS for any serious apps.
The World Wide Web was invented on a NeXT (the predecessor to the current Mac OS X), and the early spread of the WWW was on Unix machines, NOT Windows, which could barely reach the Internet over dialup PPP.
Ray Ozzie lies worse than Sarah Palin. He lies blatantly, without a trace of remorse or awareness that anyone with a couple brain cells can figure out that he's lying.
Qwitter is a service that tells you when people stop following your Twitter feed, along with your last tweet (which might be what made them stop, or might not).
I like Qwitter. I use it to get some idea of when I'm offending people, and then decide if I should do more or less of that. Obviously, offending people isn't really something I worry about; you'll either like me for who I am, or not.
Not everyone likes Qwitter or is as sanguine about losing followers as I am: Sean Bonner has a long screed which I can only call a bit emo. A bit QQ.
If someone gets mad because you unfollowed them, temporarily or permanently, they're jerks. Period. This isn't a "Mark thinks they might be a bit jerky", this is a 100% certain psychological diagnosis of terminal jerkitude. If you feel yourself getting mad over it, you need to get off the computer and maybe get drunk or laid.
Being followed on Twitter is NOT a validation of you as a person. It means you say amusing/interesting stuff someone else wants to read. If someone stops following you, it probably doesn't mean you're a bad person, it just means they're not interested anymore. Maybe it's you, maybe it's them, maybe they're just cutting back on drinking from the firehose.
If you need love, don't go to Twitter. Get a dog, or an S.O., or a teddy bear, or a whore (in order from most to least empathy).
Perilar for the iPhone is released!
New software gallery blog post: 2008-10-13:
Perilar for the iPhone is completed!
New software gallery blog post: 2008-10-05: