A group of us have formed #OpenCouncil, a small working group focused on transforming @NYCcouncil. We hope to pressure the Council to be more open, transparent, and digitally savvy. Join us.
Hugh MacLeod says the price of being a sheep is boredom, and the price of being a wolf is loneliness. My experience as a wolf says he’s right, but I think it’s bullshit, so I am writing this. There are lone wolves, but wolves are also pack animals, so how do you reconcile the two? How can we be independent together?
In this way, CATS’ ads—all of which can still be seen on his campaign website—are both a delight and a perfect representation of the candidate they were designed to sell. They worked, even though they didn’t. They were expensive, but too knobby and odd to hold much of a polish. They failed, but they were baffling and wonderful.
For the last two weeks, I’ve been struggling to understand why NYC’s comptroller launched an app called “NYC 311+, The Big Apple’s best 311 App” that has no direct connection to NYC’s 311 infrastructure. The City’s comptroller claims this app is better because it is “social” and is available in “more” languages. Yet it lacks several critical features of NYC’s 311 system: (1) a voice interface, (2) an SMS interface, and (3) holistic integration with NYC’s 311 service…
Once a review is submitted, NYC 311 forwards it to the City agency or authority responsible for oversight and maintenance. A subway review gets seen by the MTA; playground reviews are submitted to the Parks Department; and pothole complaints go to the Department of Transportation.
However valid its critique of the current 311 system may be, this app misrepresents itself AND erodes public trust in government technology. I call on the Comptroller to rename this app and re-think its advocacy.
Also note that the Comptroller was running for Mayor when this app launched at the NY Tech Meetup, one week before election day…
vinces99 writes “It’s becoming more common to have robots sub for humans to do dirty or sometimes dangerous work. But researchers are finding that, in some cases, people have started to treat robots like pets, friends or even as an extension of themselves. That raises a question: If a soldier attaches human or animal-like characteristics to a field robot, can it affect how they use the robot? What if they ‘care’ too much about the robot to send it into a dangerous situation? Julie Carpenter, who just received a doctorate in education from the University of Washington, wanted to find out. She interviewed Explosive Ordnance Disposal military personnel – highly trained soldiers who use robots to disarm explosives – about how they feel about the robots they work with every day. What she found is that troops’ relationships with robots continue to evolve as the technology changes. Soldiers told her that attachment to their robots didn’t affect their performance, yet acknowledged they felt a range of emotions such as frustration, anger and even sadness when their field robot was destroyed. That makes Carpenter wonder whether outcomes on the battlefield could potentially be compromised by human-robot attachment, or the feeling of self-extension into the robot described by some operators.”