Tag Archives: smartphone

Curate Your Own Death


It’s your funeral. So why not manage it yourself?

A new crop of smartphone and web apps aims to deliver end-of-life planning services directly to your small screen. Not only can you manage your own funeral, some of these services even help you curate your own afterlife. Apparently, apps like Cake, SafeBeyond, Everplans and Everest are perfectly suited to millennials, many of whom already curate significant aspects of their lives online.

From the Guardian:

A young man is staring straight into the camera. He looks late 20s or early 30s, with a suede blazer and two-toned hipster glasses, and cheerfully waves as he introduces himself. “Hi, my name’s Will,” he tells the YouTube audience. “And I’m dead.”

“While my family is a bit upset, they’re not stressed. Because when I was among the land of the living, I made the incredibly smart move of signing up for Everest.”

Will flashes a smile. His family plans his funeral in the background, using the detailed plan he left behind.

Everest is a Houston-based funeral concierge, and the firm that commissioned Will’s upbeat, millennial-friendly video last fall from Sandwich Video, a Los Angeles production company popular with the tech set in Silicon Valley. Everest published the film in February 2016 as part of a campaign to target millennials, hoping even twentysomethings can be lured into thinking about their digital afterlives.

Everest is just one of a wave of apps and digital services that are emerging to help millennials plan their own #authentic mortal passings, right down to Instagram-worthy funerals. Last fall, rival apps Cake and SafeBeyond were released within one month of each other, and both hope to streamline end-of-life planning into one simple app.

Death apps promise to help a person organize his or her entire online life into a bundle of digital living wills, funeral plans, multimedia memorial portfolios and digital estate arrangements. It could be the mother of all personal media accounts, designed to store all of a person’s online passwords in one spot, for a successor to retrieve after he or she dies.

But millennials already curate their digital lives to perfection on social media. So are these “death apps” just adding another layer of pressure to personalize yet another stage of their lives?

Read the entire story here.

Image: Six Feet Under, opening title. Courtesy: HBO / Wikia.

Shear Madness/Genius: Smartphoneless For 18 Months

Read the following sentence and you’ll conclude that this person is stark raving mad.

Writer Jenna Woginrich jettisoned her smartphone and lived 18 months without mobile calls, texting, status updates or alerts.

Now read her complete story, excerpted below, and you’ll realize that after 18 months without a smartphone she is perfectly sane, more balanced, less stressed and generally more human.

From Jenna Woginrich via the Guardian:

The phone rings: it’s my friend checking to see if I can pick her up on the way to a dinner party. I ask her where she is and as she explains, I reach as far as I can across the countertop for a pen. I scribble the address in my trusty notebook I keep in my back pocket. I tell her I’ll be at her place in about 20 minutes, give or take a few. Then I hang up. Literally.

I physically take the handset receiver away from my ear and hang it on the weight-triggered click switch that cuts off my landline’s dial tone.

I take my laptop, Google the address, add better directions to my notes and head outside to my 1989 pick-up truck (whose most recent technological feature is a cassette player) and drive over. If I get lost on the way, I’ll need to ask someone for directions. If she changes her plans, she won’t be able to tell me or cancel at a moment’s notice. If I crash on the way, I won’t be calling 911.

I’m fine with all of this. As you guessed by now, I haven’t had a cellphone for more than 18 months.

I didn’t just cancel cellular service and keep the smartphone for Wi-Fi fun, nor did I downgrade to a flip phone to “simplify”; I opted out entirely. There is no mobile phone in my life, in any form, at all.

Arguably, there should be. I’m a freelance writer and graphic designer with many reasons to have a little computer in my holster, but I don’t miss it. There are a dozen ways to contact me between email and social media. When I check in, it’s on my terms. No one can interrupt my bad singing of Hooked on a Feeling with a text message. It’s as freeing as the first night of a vacation.

“My phone” has become “the phone”. It’s no longer my personal assistant; it has reverted back to being a piece of furniture – like “the fridge” or “the couch”, two other items you also wouldn’t carry around on your butt.

I didn’t get rid of it for some hipster-inspired luddite ideal or because I couldn’t afford it. I cut myself off because my life is better without a cellphone. I’m less distracted and less accessible, two things I didn’t realize were far more important than instantly knowing how many movies Kevin Kline’s been in since 2010 at a moment’s notice. I can’t be bothered unless I choose to be. It makes a woman feel rich.

Read the entire story here.

Image: Western Electric Model 5302 telephone. Courtesy: ProhibitOnions, 2007 / Wikipedia. Public Domain.

iScoliosis


Industrial and occupational illnesses have followed humans since the advent of industry. Obvious ones include lung diseases from mining and a variety of skin diseases from exposure to agricultural and factory chemicals.

The late 20th century saw us succumb to carpal tunnel and other repetitive stress injuries from laboring over our desks and computers. Now, in the 21st century, we are becoming hosts to the smartphone pathogen.

In addition to the spectrum of social and cultural disorders wrought by our constantly chattering mobile devices, we are at increased psychological and physical risk. But, let’s leave aside the two obvious ones: risk from vehicle injury due to texting while driving, and risk from injury due to texting while walking. More commonly, we are at increased risk of back and other chronic physical problems resulting from poor posture. This in turn leads to mood disorders, memory problems and depression. Some have termed this condition “text-neck”, “iHunch”, or “iPosture”; I’ll go with “iScoliosis™”.

From NYT:

THERE are plenty of reasons to put our cellphones down now and then, not least the fact that incessantly checking them takes us out of the present moment and disrupts family dinners around the globe. But here’s one you might not have considered: Smartphones are ruining our posture. And bad posture doesn’t just mean a stiff neck. It can hurt us in insidious psychological ways.

If you’re in a public place, look around: How many people are hunching over a phone? Technology is transforming how we hold ourselves, contorting our bodies into what the New Zealand physiotherapist Steve August calls the iHunch. I’ve also heard people call it text neck, and in my work I sometimes refer to it as iPosture.

The average head weighs about 10 to 12 pounds. When we bend our necks forward 60 degrees, as we do to use our phones, the effective stress on our neck increases to 60 pounds — the weight of about five gallons of paint. When Mr. August started treating patients more than 30 years ago, he says he saw plenty of “dowagers’ humps, where the upper back had frozen into a forward curve, in grandmothers and great-grandmothers.” Now he says he’s seeing the same stoop in teenagers.

When we’re sad, we slouch. We also slouch when we feel scared or powerless. Studies have shown that people with clinical depression adopt a posture that eerily resembles the iHunch. One, published in 2010 in the official journal of the Brazilian Psychiatric Association, found that depressed patients were more likely to stand with their necks bent forward, shoulders collapsed and arms drawn in toward the body.

Posture doesn’t just reflect our emotional states; it can also cause them. In a study published in Health Psychology earlier this year, Shwetha Nair and her colleagues assigned non-depressed participants to sit in an upright or slouched posture and then had them answer a mock job-interview question, a well-established experimental stress inducer, followed by a series of questionnaires. Compared with upright sitters, the slouchers reported significantly lower self-esteem and mood, and much greater fear. Posture affected even the contents of their interview answers: Linguistic analyses revealed that slouchers were much more negative in what they had to say. The researchers concluded, “Sitting upright may be a simple behavioral strategy to help build resilience to stress.”

Slouching can also affect our memory: In a study published last year in Clinical Psychology and Psychotherapy of people with clinical depression, participants were randomly assigned to sit in either a slouched or an upright position and then presented with a list of positive and negative words. When they were later asked to recall those words, the slouchers showed a negative recall bias (remembering the bad stuff more than the good stuff), while those who sat upright showed no such bias. And in a 2009 study of Japanese schoolchildren, those who were trained to sit with upright posture were more productive than their classmates in writing assignments.

Read the entire article here, preferably not via your smartphone.

Image courtesy of Google Search.

 

Fight or Flight (or Record?)


Psychologists, social scientists and researchers of the human brain have long maintained that we have three typical responses to an existential, usually physical, threat. First, we may stand our ground to tackle and fight the threat. Second, we may turn and run from danger. Third, we may simply freeze with indecision and inaction. These responses have been studied, documented and confirmed over the decades. Further, they tend to mirror those of other animals when faced with a life-threatening situation.

But, now that humans have entered the smartphone age, it appears that there is a fourth response — to film or record the threat. This may seem hard to believe and foolhardy, but quite disturbingly it is a growing trend, especially among younger people.

From the Telegraph:

If you witnessed a violent attack on an innocent victim, would you:

a) help
b) run
c) freeze

Until now, that was the hypothetical question we all asked ourselves when reading about horrific events such as terror attacks.

What survival instinct would come most naturally? Fight or flight?

No longer. Over the last couple of years it’s become very obvious that there’s a fourth option:

d) record it all on your smartphone.

This reaction of filming traumatic events has become more prolific in recent weeks. Last month’s terror attacks in Paris saw mobile phone footage of people being shot, photos of bodies lying in the street, and perhaps most memorably, a pregnant woman clinging onto a window ledge.

Saturday [December 5, 2015] night saw another example when a terror suspect started attacking passengers on the Tube at Leytonstone Station. Most of the horrific incident was captured on video, as people stood filming him.

One brave man, 33-year-old engineer David Pethers, tried to fight the attacker. He ended up with a cut to the neck as he tried to protect passing children. But while he was intervening, others just held up their phones.

“There were so many opportunities where someone could have grabbed him,” he told the Daily Mail. “One guy came up to me afterwards and said ‘well done, I want to shake your hand, you are the only one who did anything, I got the whole thing on film.’

“I was so angry, I nearly turned on him but I walked away. I thought, ‘Are you crazy? You are standing there filming and did nothing.’ I was really angry afterwards.”

It’s hard to disagree. Most of us know heroism is rare and admirable. We can easily understand people trying to escape and save themselves, or even freezing in the face of terror.

But deliberately doing nothing and choosing to film the whole thing? That’s a lot harder to sympathise with.

Psychotherapist Richard Reid agrees – “the sensible option would be to think about your own safety and get out, or think about helping people” – but he says it’s important we understand this new reaction.

“Because events like terror attacks are so outside our experience, people don’t fully connect with it,” he explains.

“It’s like they’re watching a film. It doesn’t occur to them they could be in danger or they could be helping. The reality only sinks in after the event. It’s a natural phenomenon. It’s not necessarily the most useful response, but we have to accept it.”

Read the entire story here.

Image courtesy of Google Search.

The Man With No Phone

If Hitchcock were alive today the title of this post — The Man With No Phone — might be a fitting description of his latest noir, celluloid masterpiece. For in many of us the notion of being phone-less instills deep nightmarish visions of blood-curdling terror.

Does The Man With No Phone lose track of all reality, family, friends, appointments, status updates, sales records, dinner, grocery list, transportation schedules and news, turning into an empty neurotic shell of a human being? Or, does lack of constant connectivity and elimination of instant, digital gratification lead The Man With No Phone to become a schizoid, feral monster? Let’s read on to find out.

[tube]uWhkbDMISl8[/tube]

Large swathes of the world are still phone-less, and much of the global population — at least those of us over the age of 35 — grew up smartphone-less and even cellphone-less. So, it’s rather disconcerting to read Steve Hilton’s story; he’s been phone-less for 3 years now. However, it’s not disconcerting that he’s without a phone — I find it inspiring (and normal); it’s disconcerting that many people are wondering how on earth he can live without one. And, even more perplexing — why would anyone need a digital detox or mindfulness app on their smartphone? Just hide the thing in your junk drawer for a week (or more) and breathe out. Long live The Man With No Phone!

From the Guardian:

Before you read on, I want to make one thing clear: I’m not trying to convert you. I’m not trying to lecture you or judge you. Honestly, I’m not. It may come over like that here and there, but believe me, that’s not my intent. In this piece, I’m just trying to … explain.

People who knew me in a previous life as a policy adviser to the British prime minister are mildly surprised that I’m now the co-founder and CEO of a tech startup. And those who know that I’ve barely read a book since school are surprised that I have now actually written one.

But the single thing that no one seems able to believe – the thing that apparently demands explanation – is the fact that I am phone-free. That’s right: I do not own a cellphone; I do not use a cellphone. I do not have a phone. No. Phone. Not even an old-fashioned dumb one. Nothing. You can’t call me unless you use my landline – yes, landline! Can you imagine? At home. Or call someone else that I happen to be with (more on that later).

When people discover this fact about my life, they could not be more surprised than if I had let slip that I was actually born with a chicken’s brain. “But how do you live?” they cry. And then: “How does your wife feel about it?” More on that too, later.

As awareness has grown about my phone-free status (and its longevity: this is no passing fad, people – I haven’t had a phone for over three years), I have received numerous requests to “tell my story”. People seem to be genuinely interested in how someone living and working in the heart of the most tech-obsessed corner of the planet, Silicon Valley, can possibly exist on a day-to-day basis without a smartphone.

So here we go. Look, I know it’s not exactly Caitlyn Jenner, but still: here I am, and here’s my story.

In the spring of 2012, I moved to the San Francisco bay area with my wife and two young sons. Rachel was then a senior executive at Google, which involved a punishing schedule to take account of the eight-hour time difference. I had completed two years at 10 Downing Street as senior adviser to David Cameron – let’s just put it diplomatically and say that I and the government machine had had quite enough of each other. To make both of our lives easier, we moved to California.

I took with me my old phone, which had been paid for by the taxpayer. It was an old Nokia phone – I always hated touch-screens and refused to have a smartphone; neither did I want a BlackBerry or any other device on which the vast, endless torrent of government emails could follow me around. Once we moved to the US my government phone account was of course stopped and telephonically speaking, I was on my own.

I tried to get hold of one of my beloved old Nokia handsets, but they were no longer available. Madly, for a couple of months I used old ones procured through eBay, with a pay-as-you-go plan from a UK provider. The handsets kept breaking and the whole thing cost a fortune. Eventually, I had enough when the charging outlet got blocked by sand after a trip to the beach. “I’m done with this,” I thought, and just left it.

I remember the exact moment when I realized something important had happened. I was on my bike, cycling to Stanford, and it struck me that a week had gone by without my having a phone. And everything was just fine. Better than fine, actually. I felt more relaxed, carefree, happier. Of course a lot of that had to do with moving to California. But this was different. I felt this incredibly strong sense of just thinking about things during the day. Being able to organize those thoughts in my mind. Noticing things.

Read the entire story here.

Video: Hanging on the Telephone, Blondie. Courtesy: EMI Music.

On the Joys of Not Being Twenty Again

I’m not twenty, and am constantly reminded that I’m not — both from internal alerts and external messages. Would I like to be younger? Of course. But it certainly comes at a price. So, after reading the exploits of a 20-something forced to live without her smartphone for a week, I realize it’s not all that bad being a cranky old luddite.

I hope that the ordeal, excerpted below, is tongue-very-much-in-cheek but I suspect it’s not: constant status refreshes, morning selfies, instant content gratification, nano-scale attention span, over-stimulation, life-stream documentation, peer ranking, group-think, interrupted interruptions. Thus, I realize I’m rather content not to be twenty after all.

From the Telegraph:

I have a confession to make: I am addicted to my smartphone. I use it as an alarm clock, map, notepad, mirror and camera.

I spend far too much time on Twitter and Instagram and have this week realised I have a nervous tic where I repeatedly unlock my smartphone.

And because of my phone’s many apps which organise my life and help me navigate the world, like many people my age, I am quite literally lost without it.

I am constantly told off by friends and family for using my phone during conversations, and I recently found out (to my horror) that I have taken over 5,000 selfies.

So when my phone broke I seized the opportunity to spend an entire week without it, and kept a diary each day.

Day One: Thursday

Frazzled, I reached to my bedside table, so I could take a morning selfie and send it to my friends.

Realising why that could not happen, my hand and my heart both felt empty. I knew at this point it was going to be a long week.

Day Two: Friday

I basked in the fact my colleagues could not contact me – and if I did not reply to their emails straight away it would not be the end of the world.

I then took the train home to see my parents outside London.

I couldn’t text my mother about any delays which may have happened (they didn’t), and she couldn’t tell me if she was going to be late to the station (she wasn’t). The lack of phone did nothing but make me feel anxious and prevent me from being able to tweet about the irritating children screaming on the train.

Day Three: Saturday

It is a bit weird feeling completely cut off from the outside world; I am not chained to my computer like I am at work and I am not allowed to constantly be on my laptop like a teen hacker.

It was nice though – a real detox. We went on a walk with our spaniel in the countryside near the Chiltern Hills. I had to properly talk to everyone, instead of constantly refreshing Twitter, which was novel.

I do feel like my attention span is improving every day, but I equally feel anchorless and lost without having any way of contacting anyone, or documenting my life.

….

Day Seven: Wednesday

My attention span and patience have grown somewhat, and I have noticed I daydream and have thoughts independent of Twitter far more often than usual.

Read the entire account here.

Social Media Lice


We know that social media helps us stay superficially connected to others. We also know many of the drawbacks — an over-inflated and skewed sense of self; poor understanding and reduced thoughtfulness; neurotic fear of missing out (FOMO); public shaming, online bullying and trolling.

But, now we hear that one of the key foundations of social media — the taking and sharing of selfies — has more serious consequences. Social media has caused an explosion in head lice, especially in teenagers, particularly girls. Call it: social media head lice syndrome. While this may cause you to scratch your head in disbelief, or for psychosomatic reasons, the mechanism behind the outbreak is rather obvious. It goes like this: a group of teens needs a quick selfie fix; teens crowd around the smartphone and pose; teens lean in, heads together; head lice crawl from one scalp to the next.

From the Independent:

Selfies have sparked an explosion in the number of head lice cases among teenagers, a group of US paediatricians has warned.

The group said there is a growing trend of “social media lice” where lice spread when teenagers cram their heads together to take a selfie.

Lice cannot jump so they are less common in older children who do not tend to swap hats or headgear.

A Wisconsin paediatrician, Dr Sharon Rink, told local news channel WBAY2 she has seen a surge of teenagers coming to see her for treatment, something which was unheard of five years ago.

Dr Rink said: “People are doing selfies like every day, as opposed to going to photo booths years and years ago.

“So you’re probably having much more contact with other people’s heads.

“If you have an extremely itchy scalp and you’re a teenager, you might want to get checked out for lice instead of chalking it up to dandruff.”

In its official online guide to preventing the spread of head lice, the Centers for Disease Control and Prevention recommends avoiding head-to-head contact where possible and suggests girls are more likely to get the parasite than boys because they tend to have “more frequent head-to-head contact”.

Read (and scratch) more here.

Image courtesy of Google Search.

 

Don’t Call Me; I’ll Not Call You Either


We all have smartphones, but the phone call is dead. That arcane tool of real-time conversation between two people (sometimes more) is making way for asynchronous sharing via text, image and other data.

From the Atlantic:

One of the ironies of modern life is that everyone is glued to their phones, but nobody uses them as phones anymore. Not by choice, anyway. Phone calls—you know, where you put the thing up to your ear and speak to someone in real time—are becoming relics of a bygone era, the “phone” part of a smartphone turning vestigial as communication evolves, willingly or not, into data-oriented formats like text messaging and chat apps.

The distaste for telephony is especially acute among Millennials, who have come of age in a world of AIM and texting, then gchat and iMessage, but it’s hardly limited to young people. When asked, people with a distaste for phone calls argue that they are presumptuous and intrusive, especially given alternative methods of contact that don’t make unbidden demands for someone’s undivided attention. In response, some have diagnosed a kind of telephoniphobia among this set. When even initiating phone calls is a problem—and even innocuous ones, like phoning the local Thai place to order takeout—then anxiety rather than habit may be to blame: When asynchronous, textual media like email or WhatsApp allow you to intricately craft every exchange, the improvisational nature of ordinary, live conversation can feel like an unfamiliar burden. Those in power sometimes think that this unease is a defect in need of remediation, while those supposedly afflicted by it say they are actually just fine, thanks very much.

But when it comes to taking phone calls and not making them, nobody seems to have admitted that using the telephone today is a different material experience than it was 20 or 30 (or 50) years ago, not just a different social experience. That’s not just because our phones have also become fancy two-way pagers with keyboards, but also because they’ve become much crappier phones. It’s no wonder that a bad version of telephony would be far less desirable than a good one. And the telephone used to be truly great, partly because of the situation of its use, and partly because of the nature of the apparatus we used to refer to as the “telephone”—especially the handset.

On the infrastructural level, mobile phones operate on cellular networks, which route calls between transceivers distributed across a service area. These networks are wireless, obviously, which means that signal strength, traffic, and interference can make calls difficult or impossible. Together, these factors have made phone calls synonymous with unreliability. Failures to connect, weak signals that staccato sentences into bursts of signal and silence, and the frequency of dropped calls all help us find excuses not to initiate or accept a phone call.

By contrast, the traditional, wired public switched telephone network (PSTN) operates by circuit switching. When a call is connected, one line is connected to another by routing it through a network of switches. At first these were analog signals running over copper wire, which is why switchboard operators had to help connect calls. But even after the PSTN went digital and switching became automated, a call was connected and then maintained over a reliable circuit for its duration. Calls almost never dropped and rarely failed to connect.

But now that more than half of American adults under 35 use mobile phones as their only phones, the intrinsic unreliability of the cellular network has become internalized as a property of telephony. Even if you might have a landline on your office desk, the cellular infrastructure has conditioned us to think of phone calls as fundamentally unpredictable affairs. Of course, why single out phones? IP-based communications like IM and iMessage are subject to the same signal and routing issues as voice, after all. But because those services are asynchronous, a slow or failed message feels like less of a failure—you can just regroup and try again. When you combine the seemingly haphazard reliability of a voice call with the sense of urgency or gravity that would recommend a phone call instead of a Slack DM or an email, the risk of failure amplifies the anxiety of unfamiliarity. Telephone calls now exude untrustworthiness from their very infrastructure.

Going deeper than dropped connections, telephony suffered from audio-signal processing compromises long before cellular service came along, but the differences between mobile and landline phone usage amplifies those challenges, as well. At first, telephone audio was entirely analogue, such that the signal of your voice and your interlocutor’s would be sent directly over the copper wire. The human ear can hear frequencies up to about 20 kHz, but for bandwidth considerations, the channel was restricted to a narrow frequency range called the voice band, between 300 and 3,400 Hz. It was a reasonable choice when the purpose of phones—to transmit and receive normal human speech—was taken into account.

By the 1960s, demand for telephony recommended more efficient methods, and the transistor made it both feasible and economical to carry many more calls on a single, digital circuit. The standard that was put in place cemented telephony’s commitment to the voice band, a move that would reverberate in the ears of our mobile phones a half-century later.

In order to digitally switch calls, the PSTN became subject to sampling, the process of converting a continuous signal to a discrete one. Sampling is carried out by capturing snapshots of a source signal at a specific interval. A principle called the Nyquist–Shannon sampling theorem specifies that a waveform of a particular maximum frequency can be reconstructed from a sample taken at twice that frequency per second. Since the voice band required only 4 kHz of bandwidth, a sampling rate of 8 kHz (that is, 8,000 samples per second) was established by Bell Labs engineers for a voice digitization method. This system used a technique developed by Bernard Oliver, John Pierce, and Claude Shannon in the late ‘40s called Pulse Code Modulation (PCM). In 1962, Bell began deploying PCM into the telephone-switching network, and the 3 kHz range for telephone calls was effectively fixed.
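For readers who want to see what that 8 kHz figure implies, here is a small Python sketch of my own (not from the article) of the Nyquist–Shannon limit: tones inside the 300–3,400 Hz voice band survive sampling at 8 kHz, while anything above 4 kHz either has to be filtered out before sampling, as real telephone channels do with an anti-aliasing filter, or folds back onto a lower frequency.

```python
import numpy as np

FS = 8000          # PSTN sampling rate (Hz), per the 1960s PCM standard
NYQUIST = FS / 2   # highest frequency an 8 kHz stream can represent: 4 kHz

def dominant_frequency(tone_hz, fs=FS, seconds=1.0):
    """Sample a pure tone at `fs` and return the frequency the samples actually encode."""
    t = np.arange(int(fs * seconds)) / fs
    samples = np.sin(2 * np.pi * tone_hz * t)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1 / fs)
    return freqs[np.argmax(spectrum)]

for tone in (300, 2000, 3400, 5000, 7000):
    print(f"{tone} Hz tone -> encoded as {dominant_frequency(tone):.0f} Hz")

# 300, 2000 and 3400 Hz come through unchanged; 5000 Hz folds back (aliases)
# to 3000 Hz and 7000 Hz to 1000 Hz. Either way, content above ~4 kHz cannot
# survive an 8 kHz-sampled voice channel, which is the point of the paragraph above.
```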

Since the PSTN is still very much alive and well and nearly entirely digitized save for the last mile, this sampling rate has persisted over the decades. (If you have a landline in an older home, its signal is probably still analog until it reaches the trunk of your telco provider.) Cellular phones still have to interface with the ordinary PSTN, so they get sampled in this range too.

Two intertwined problems arise. First, it turns out that human voices may transmit important information well above 3,300 Hz or even 5,000 Hz. The auditory neuroscientist Brian Monson has conducted substantial research on high-frequency energy perception. A widely-covered 2011 study showed that subjects could still discern communicative information well above the frequencies typically captured in telephony. Even though frequencies above 5,000 Hz are too high to transmit clear spoken language without the lower frequencies, Monson’s subjects could discern talking from singing and determine the sex of the speaker with reasonable accuracy, even when all the signal under 5,000 Hz was removed entirely. Monson’s study shows that 20th-century bandwidth and sampling choices may have rested on incorrect assumptions about how much of the range of human hearing is used for communication by voice.

That wasn’t necessarily an issue until the second part of the problem arises: the way we use mobile phones versus landline phones. When the PSTN was first made digital, home and office phones were used in predictable environments: a bedroom, a kitchen, an office. In these circumstances, telephony became a private affair cut off from the rest of the environment. You’d close the door or move into the hallway to conduct a phone call, not only for the quiet but also for the privacy. Even in public, phones were situated out-of-the-way, whether in enclosed phone booths or tucked away onto walls in the back of a diner or bar, where noise could be minimized.

Today, of course, we can and do carry our phones with us everywhere. And when we try to use them, we’re far more likely to be situated in an environment that is not compatible with the voice band—coffee shops, restaurants, city streets, and so forth. Background noise tends to be low-frequency, and, when it’s present, the higher frequencies, which Monson showed matter more than we thought in any circumstance, become particularly important. But because digital sampling makes those frequencies unavailable, we tend not to be able to hear clearly. Add digital signal loss from low or wavering wireless signals, and the situation gets even worse. Not only are phone calls unstable, but even when they connect and stay connected in a technical sense, you still can’t hear well enough to feel connected in a social one. By their very nature, mobile phones make telephony seem unreliable.

Read the entire story here.

Image courtesy of Google Search.

Your Goldfish is Better Than You


Well, perhaps not at philosophical musings or mathematics. But, your little orange aquatic friend now has an attention span that is longer than yours. And, it’s all thanks to mobile devices and multi-tasking on multiple media platforms. [Psst, by the way, multi-tasking at the level of media consumption is a fallacy.] On average, the adult attention span is now down to a laughably paltry 8 seconds, whereas the lowly goldfish comes in at 9 seconds. Where of course that leaves your inbetweeners and teenagers is anyone’s guess.

From the Independent:

Humans have become so obsessed with portable devices and overwhelmed by content that we now have attention spans shorter than that of the previously jokingly juxtaposed goldfish.

Microsoft surveyed 2,000 people and used electroencephalograms (EEGs) to monitor the brain activity of another 112 in the study, which sought to determine the impact that pocket-sized devices and the increased availability of digital media and information have had on our daily lives.

Among the good news in the 54-page report is that our ability to multi-task has drastically improved in the information age, but unfortunately attention spans have fallen.

In 2000 the average attention span was 12 seconds, but this has now fallen to just eight. The goldfish is believed to be able to maintain a solid nine.

“Canadians [who were tested] with more digital lifestyles (those who consume more media, are multi-screeners, social media enthusiasts, or earlier adopters of technology) struggle to focus in environments where prolonged attention is needed,” the study reads.

“While digital lifestyles decrease sustained attention overall, it’s only true in the long-term. Early adopters and heavy social media users front load their attention and have more intermittent bursts of high attention. They’re better at identifying what they want/don’t want to engage with and need less to process and commit things to memory.”

Anecdotally, many of us can relate to the increasing inability to focus on tasks, being distracted by checking your phone or scrolling down a news feed.

Another recent study by the National Centre for Biotechnology Information and the National Library of Medicine in the US found that 79 per cent of respondents used portable devices while watching TV (known as dual-screening) and 52 per cent check their phone every 30 minutes.

Read the entire story here.

Image: Common Goldfish. Public Domain.

 

Circadian Misalignment and Your Smartphone


You take your portable electronics everywhere, all the time. You watch TV with or on your smartphone. You eat with a fork in one hand and your smartphone in the other. In fact, you probably wish you had two pairs of arms so you could eat, drink and use your smartphone and laptop at the same time. You use your smartphone in your car — hopefully or sensibly not while driving. You read texts on your smartphone while in the restroom. You use it at the movie theater, at the theater (much to the dismay of stage actors). It’s with you at the restaurant, on the bus or metro, in the aircraft, in the bath (despite the risk of electric shock). You check your smartphone first thing in the morning and last thing before going to sleep. And, if your home or work life demands it, you will check it periodically throughout the night.

Let’s leave aside for now the growing body of anecdotal and formal evidence that smartphones are damaging your physical wellbeing. This includes finger, hand and wrist problems (from texting); and neck and posture problems (from constantly bending over your small screen). Now there is evidence that constant use, especially at night, is damaging your mental wellbeing and increasing the likelihood of additional, chronic physical ailments. It appears that the light from our constant electronic companions is not healthy, particularly as it disrupts our regular rhythm of sleep.

From Wired:

For more than 3 billion years, life on Earth was governed by the cyclical light of sun, moon and stars. Then along came electric light, turning night into day at the flick of a switch. Our bodies and brains may not have been ready.

A fast-growing body of research has linked artificial light exposure to disruptions in circadian rhythms, the light-triggered releases of hormones that regulate bodily function. Circadian disruption has in turn been linked to a host of health problems, from cancer to diabetes, obesity and depression. “Everything changed with electricity. Now we can have bright light in the middle of night. And that changes our circadian physiology almost immediately,” says Richard Stevens, a cancer epidemiologist at the University of Connecticut. “What we don’t know, and what so many people are interested in, are the effects of having that light chronically.”

Stevens, one of the field’s most prominent researchers, reviews the literature on light exposure and human health in the latest Philosophical Transactions of the Royal Society B. The new article comes nearly two decades after Stevens first sounded the alarm about light exposure possibly causing harm; writing in 1996, he said the evidence was “sparse but provocative.” Since then, nighttime light has become even more ubiquitous: an estimated 95 percent of Americans regularly use screens shortly before going to sleep, and incandescent bulbs have been mostly replaced by LED and compact fluorescent lights that emit light in potentially more problematic wavelengths. Meanwhile, the scientific evidence is still provocative, but no longer sparse.

As Stevens says in the new article, researchers now know that increased nighttime light exposure tracks with increased rates of breast cancer, obesity and depression. Correlation isn’t causation, of course, and it’s easy to imagine all the ways researchers might mistake those findings. The easy availability of electric lighting almost certainly tracks with various disease-causing factors: bad diets, sedentary lifestyles, exposure to the array of chemicals that come along with modernity. Oil refineries and aluminum smelters, to be hyperbolic, also blaze with light at night.

Yet biology at least supports some of the correlations. The circadian system synchronizes physiological function—from digestion to body temperature, cell repair and immune system activity—with a 24-hour cycle of light and dark. Even photosynthetic bacteria thought to resemble Earth’s earliest life forms have circadian rhythms. Despite its ubiquity, though, scientists discovered only in the last decade what triggers circadian activity in mammals: specialized cells in the retina, the light-sensing part of the eye, rather than conveying visual detail from eye to brain, simply signal the presence or absence of light. Activity in these cells sets off a reaction that calibrates clocks in every cell and tissue in a body. Now, these cells are especially sensitive to blue wavelengths—like those in a daytime sky.

But artificial lights, particularly LCDs, some LEDs, and fluorescent bulbs, also favor the blue side of the spectrum. So even a brief exposure to dim artificial light can trick a night-subdued circadian system into behaving as though day has arrived. Circadian disruption in turn produces a wealth of downstream effects, including dysregulation of key hormones. “Circadian rhythm is being tied to so many important functions,” says Joseph Takahashi, a neurobiologist at the University of Texas Southwestern. “We’re just beginning to discover all the molecular pathways that this gene network regulates. It’s not just the sleep-wake cycle. There are system-wide, drastic changes.” His lab has found that tweaking a key circadian clock gene in mice gives them diabetes. And a tour-de-force 2009 study put human volunteers on a 28-hour day-night cycle, then measured what happened to their endocrine, metabolic and cardiovascular systems.

Crucially, that experiment investigated circadian disruption induced by sleep alteration rather than light exposure, which is also the case with the many studies linking clock-scrambling shift work to health problems. Whether artificial light is as problematic as disturbed sleep patterns remains unknown, but Stevens thinks that some and perhaps much of what’s now assumed to result from sleep issues is actually a function of light. “You can wake up in the middle of the night and your melatonin levels don’t change,” he says. “But if you turn on a light, melatonin starts falling immediately. We need darkness.” According to Stevens, most people live in a sort of “circadian fog.”

Read the entire article here.

Image courtesy of Google Search.

Cellphone Only Lanes


You’ve seen the high occupancy vehicle lane on select highways. You’ve seen pedestrian only zones. You’ve seen cycle friendly zones. Now, it’s time for the slow walking lane — for pedestrians using smartphones! Perhaps we’ll eventually see separate lanes for tourists with tablets, smartwatch users and, of course, a completely separate zone for texting t(w)eens.

From the Independent:

The Chinese city of Chongqing claims to have introduced the world’s first ‘slow-walking lane’ for smartphone users.

No more will the most efficient of pedestrians be forced to stare frustratedly at the occiput of their meandering counterparts.

Two 100-ft lanes have been painted on to a pavement in the city, with one side reserved for those wanting to stare into their handheld device and the other exclusively for those who can presumably spare five minutes without checking their latest Weibo update.

However, according to the Telegraph, officials in Chongqing only introduced the signage to make the point that “it is best not to play with your phone while walking”.

Read the entire story here.

Image: City of Chongqing. Courtesy of the Independent.

 

Mesh Networks: Coming to a Phone Near You


Soon you’ll be able to text and chat online without the need of a cellular network or the Internet. There is a catch though: you’ll need yet another chat app for your smartphone and you will need to be within 100 or so yards of your chatting friend. But, this is just the beginning of so-called “mesh networks” that can be formed through peer-to-peer device connections, avoiding the need for cellular communications. As mobile devices continue to proliferate, such local, device-to-device connections could become more practical.

From Technology Review:

Mobile app stores are stuffed with messaging apps from WhatsApp to Tango and their many imitators. But FireChat, released last week for the iPhone, stands out. It’s the only one that can be used without cell-phone reception.

FireChat makes use of a feature Apple introduced in the latest version of its iOS mobile software, iOS7, called multipeer connectivity. This feature allows phones to connect to one another directly using Bluetooth or Wi-Fi as an alternative to the Internet. If you’re using FireChat, its “nearby” chat room lets you exchange messages with other users within 100 feet without sending data via your cellular provider.

Micha Benoliel, CEO and cofounder of startup Open Garden, which made FireChat, says the app shows how smartphones can be set free from cellular networks. He hopes to enable many more Internet-optional apps with the upcoming release of software tools that will help developers build FireChat-style apps for iPhone, or for Android, Mac, and Windows devices. “This approach is very interesting for multiplayer gaming and all kinds of communication apps,” says Benoliel.

Anthony DiPasquale, a developer with consultancy Thoughtbot, says FireChat is the only app he’s aware of that’s been built to make use of multipeer connectivity, perhaps because the feature remains unfamiliar to most Apple developers. “I hope more people start to use it soon,” he says. “It’s an awesome framework with a lot of potential. There is probably a great use for multipeer connectivity in every situation where there are people grouped together wanting to share some sort of information.” DiPasquale has dabbled in using multipeer connectivity himself, creating an experimental app that streams music from one device to several others nearby.

The new feature of iOS7 currently only supports data moving directly from one device to another, and from one device to several others. However, Open Garden’s forthcoming software will extend the feature so that data can hop between two iPhones out of range of one another via intermediary devices. That approach, known as mesh networking, is at the heart of several existing projects to create disaster-proof or community-controlled communications networks (see “Build Your Own Internet with Mobile Mesh Networking”).
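To make the relay idea concrete, here is a purely illustrative Python sketch, with made-up device names, positions and a nominal 100-foot range; it has nothing to do with Open Garden’s actual software, it just shows the general mesh principle that a message can hop through intermediaries when sender and receiver are out of direct range.

```python
from collections import deque
import math

RANGE_FT = 100  # rough direct-connection range used in the article's example

def neighbors(devices, name):
    """Devices close enough to `name` for a direct (Bluetooth/Wi-Fi) link."""
    x, y = devices[name]
    return [other for other, (ox, oy) in devices.items()
            if other != name and math.hypot(x - ox, y - oy) <= RANGE_FT]

def relay_path(devices, source, target):
    """Breadth-first search for a chain of in-range devices from source to target."""
    queue, seen = deque([[source]]), {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in neighbors(devices, path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of intermediaries exists

# Alice and Dana are 240 ft apart (out of direct range), but Bob and Carol
# sit in between and can forward the message hop by hop.
devices = {"Alice": (0, 0), "Bob": (90, 0), "Carol": (170, 0), "Dana": (240, 0)}
print(relay_path(devices, "Alice", "Dana"))  # ['Alice', 'Bob', 'Carol', 'Dana']
```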

Apps built to exploit such device-to-device schemes can offer security and privacy benefits over those that rely on the Internet. For example, messages sent using FireChat to nearby devices don’t pass through any systems operated by either Open Garden or a wireless carrier (although they are broadcast to all FireChat users nearby).

That means the content of a message and metadata could not be harvested from a central communications hub by an attacker or government agency. “This method of communication is immune to firewalls like the ones installed in China and North Korea,” says Mattt Thompson, a software engineer who writes the iOS and Mac development blog NSHipster. Recent revelations about large-scale surveillance of online services and the constant litany of data breaches make this a good time for apps that don’t rely on central servers, he says. “As users become more mindful of the security and privacy implications of technologies they rely on, moving in the direction of local, ad-hoc networking makes a lot of sense.”

However, peer-to-peer and mesh networking apps also come with their own risks, since an eavesdropper could gain access to local traffic just by using a device within range.

Read the entire article here.

Image courtesy of Open Garden.

Big Data Knows What You Do and When

Data scientists are getting to know more about you and your fellow urban dwellers as you move around your neighborhood and your city. As smartphones and cell towers become more ubiquitous, and data collection and analysis gather pace, researchers (and advertisers) will come to know your daily habits and schedule rather intimately. So, questions from a significant other along the lines of, “and, where were you at 11:15 last night?” may soon be consigned to history.

From Technology Review:

Mobile phones have generated enormous insight into the human condition thanks largely to the study of the data they produce. Mobile phone companies record the time of each call, the caller and receiver IDs, as well as the locations of the cell towers involved, among other things.

The combined data from millions of people produces some fascinating new insights in the nature of our society.

Anthropologists have crunched it to reveal human reproductive strategies, a universal law of commuting and even the distribution of wealth in Africa.

Today, computer scientists have gone one step further by using mobile phone data to map the structure of cities and how people use them throughout the day. “These results point towards the possibility of a new, quantitative classification of cities using high resolution spatio-temporal data,” say Thomas Louail at the Institut de Physique Théorique in Paris and a few pals.

They say their work is part of a new science of cities that aims to objectively measure and understand the nature of large population centers.

These guys begin with a database of mobile phone calls made by people in the 31 Spanish cities that have populations larger than 200,000. The data consists of the number of unique individuals using a given cell tower (whether making a call or not) for each hour of the day over almost two months.

Given the area that each tower covers, Louail and co work out the density of individuals in each location and how it varies throughout the day. And using this pattern, they search for “hotspots” in the cities where the density of individuals passes some specially chosen threshold at certain times of the day.
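As a rough illustration of that density-and-threshold step, here is a toy Python sketch with invented tower counts, coverage areas and threshold; the real study repeats this for every hour over two months across 31 cities.

```python
# Hypothetical numbers for one city during one particular hour.
hourly_users = {            # tower -> unique individuals seen during that hour
    "tower_center": 5200,
    "tower_station": 3100,
    "tower_suburb_a": 600,
    "tower_suburb_b": 450,
}
coverage_km2 = {            # approximate area each tower covers
    "tower_center": 0.8,
    "tower_station": 1.1,
    "tower_suburb_a": 3.5,
    "tower_suburb_b": 4.0,
}

# Density of individuals per square kilometre in each tower's area.
density = {t: hourly_users[t] / coverage_km2[t] for t in hourly_users}

# Flag a hotspot wherever density exceeds a chosen multiple of the city-wide mean.
mean_density = sum(density.values()) / len(density)
THRESHOLD = 2.0             # the "specially chosen threshold"; value made up here
hotspots = [t for t, d in density.items() if d > THRESHOLD * mean_density]

print(hotspots)  # -> ['tower_center'] for this particular hour
```

Tracking how that hotspot set grows, shrinks and moves hour by hour is what produces the “breathing” pattern described below.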

The results reveal some fascinating patterns in city structure. For a start, every city undergoes a kind of respiration in which people converge into the center and then withdraw on a daily basis, almost like breathing. And this happens in all cities. This “suggests the existence of a single ‘urban rhythm’ common to all cities,” says Louail and co.

During the week, the number of phone users peaks at about midday and then again at about 6 p.m. During the weekend the numbers peak a little later: at 1 p.m. and 8 p.m. Interestingly, the second peak starts about an hour later in western cities, such as Sevilla and Cordoba.

The data also reveals that small cities tend to have a single center that becomes busy during the day, such as the cities of Salamanca and Vitoria.

But it also shows that the number of hotspots increases with city size; so-called polycentric cities include Spain’s largest, such as Madrid, Barcelona, and Bilbao.

That could turn out to be useful for automatically classifying cities.

Read the entire article here.

How Apple With the Help of Others Invented the iPhone

Apple’s invention of the iPhone is a story of insight, collaboration, cannibalization and dogged persistence over the period of a decade.

From Slate:

Like many of Apple’s inventions, the iPhone began not with a vision, but with a problem. By 2005, the iPod had eclipsed the Mac as Apple’s largest source of revenue, but the music player that rescued Apple from the brink now faced a looming threat: The cellphone. Everyone carried a phone, and if phone companies figured out a way to make playing music easy and fun, “that could render the iPod unnecessary,” Steve Jobs once warned Apple’s board, according to Walter Isaacson’s biography.

Fortunately for Apple, most phones on the market sucked. Jobs and other Apple executives would grouse about their phones all the time. The simplest phones didn’t do much other than make calls, and the more functions you added to phones, the more complicated they were to use. In particular, phones “weren’t any good as entertainment devices,” Phil Schiller, Apple’s longtime marketing chief, testified during the company’s patent trial with Samsung. Getting music and video on 2005-era phones was too difficult, and if you managed that, getting the device to actually play your stuff was a joyless trudge through numerous screens and menus.

That was because most phones were hobbled by a basic problem—they didn’t have a good method for input. Hard keys (like the ones on the BlackBerry) worked for typing, but they were terrible for navigation. In theory, phones with touchscreens could do a lot more, but in reality they were also a pain to use. Touchscreens of the era couldn’t detect finger presses—they needed a stylus, and the only way to use a stylus was with two hands (one to hold the phone and one to hold the stylus). Nobody wanted a music player that required two-handed operation.

This is the story of how Apple reinvented the phone. The general outlines of this tale have been told before, most thoroughly in Isaacson’s biography. But the Samsung case—which ended last month with a resounding victory for Apple—revealed a trove of details about the invention, the sort of details that Apple is ordinarily loath to make public. We got pictures of dozens of prototypes of the iPhone and iPad. We got internal email that explained how executives and designers solved key problems in the iPhone’s design. We got testimony from Apple’s top brass explaining why the iPhone was a gamble.

Put it all together and you get a remarkable story about a device that, under the normal rules of business, should not have been invented. Given the popularity of the iPod and its centrality to Apple’s bottom line, Apple should have been the last company on the planet to try to build something whose explicit purpose was to kill music players. Yet Apple’s inner circle knew that one day, a phone maker would solve the interface problem, creating a universal device that could make calls, play music and videos, and do everything else, too—a device that would eat the iPod’s lunch. Apple’s only chance at staving off that future was to invent the iPod killer itself. More than this simple business calculation, though, Apple’s brass saw the phone as an opportunity for real innovation. “We wanted to build a phone for ourselves,” Scott Forstall, who heads the team that built the phone’s operating system, said at the trial. “We wanted to build a phone that we loved.”

The problem was how to do it. When Jobs unveiled the iPhone in 2007, he showed off a picture of an iPod with a rotary-phone dialer instead of a click wheel. That was a joke, but it wasn’t far from Apple’s initial thoughts about phones. The click wheel—the brilliant interface that powered the iPod (which was invented for Apple by a firm called Synaptics)—was a simple, widely understood way to navigate through menus in order to play music. So why not use it to make calls, too?

In 2005, Tony Fadell, the engineer who’s credited with inventing the first iPod, got hold of a high-end desk phone made by Samsung and Bang & Olufsen that you navigated using a set of numerical keys placed around a rotating wheel. A Samsung cell phone, the X810, used a similar rotating wheel for input. Fadell didn’t seem to like the idea. “Weird way to hold the cellphone,” he wrote in an email to others at Apple. But Jobs thought it could work. “This may be our answer—we could put the number pad around our clickwheel,” he wrote. (Samsung pointed to this thread as evidence for its claim that Apple’s designs were inspired by other companies, including Samsung itself.)

Around the same time, Jonathan Ive, Apple’s chief designer, had been investigating a technology that he thought could do wonderful things someday—a touch display that could understand taps from multiple fingers at once. (Note that Apple did not invent multitouch interfaces; it was one of several companies investigating the technology at the time.) According to Isaacson’s biography, the company’s initial plan was to use the new touch system to build a tablet computer. Apple’s tablet project began in 2003—seven years before the iPad went on sale—but as it progressed, it dawned on executives that multitouch might work on phones. At one meeting in 2004, Jobs and his team looked at a prototype tablet that displayed a list of contacts. “You could tap on the contact and it would slide over and show you the information,” Forstall testified. “It was just amazing.”

Jobs himself was particularly taken by two features that Bas Ording, a talented user-interface designer, had built into the tablet prototype. One was “inertial scrolling”—when you flick at a list of items on the screen, the list moves as a function of how fast you swipe, and then it comes to rest slowly, as if being affected by real-world inertia. Another was the “rubber-band effect,” which causes a list to bounce against the edge of the screen when there were no more items to display. When Jobs saw the prototype, he thought, “My god, we can build a phone out of this,” he told the D Conference in 2010.
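As a rough idea of what those two behaviors amount to, here is a toy Python simulation with invented constants; it makes no claim to reflect Ording’s actual implementation. A flick keeps coasting while friction bleeds off its velocity, and any overshoot past the end of the list is pulled back to the edge by a damped spring.

```python
CONTENT_MAX = 500.0   # last valid scroll offset (the "end" of the list)
FRICTION = 0.95       # per-frame velocity decay while coasting (inertial scrolling)
STIFFNESS = 0.2       # spring constant pulling an overshoot back to the edge
DAMPING = 0.8         # keeps the rubber-band bounce from oscillating forever
FRAMES = 60           # simulate one second at 60 frames per second

def simulate_flick(offset, velocity):
    """Advance the scroll offset frame by frame after the finger lifts."""
    trace = []
    bouncing = False
    for _ in range(FRAMES):
        offset += velocity
        if offset > CONTENT_MAX or bouncing:
            bouncing = True  # past the edge: a damped spring takes over
            velocity = (velocity - STIFFNESS * (offset - CONTENT_MAX)) * DAMPING
        else:
            velocity *= FRICTION  # inside the content: plain inertial coasting
        trace.append(offset)
    return trace

trace = simulate_flick(offset=450.0, velocity=20.0)
print(round(max(trace)), round(trace[-1]))  # overshoots past 500, then settles back near the edge
```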

Read the entire article here.

Image: Retro design iPhone. Courtesy of Ubergizmo.