Comments on a cultural reality between past and future.

This blog describes Metatime in the Posthuman experience, drawn from Sir Isaac Newton's secret work on the future end of times, a tract in which he described Histories of Things to Come. His hidden papers on the occult were auctioned to two private buyers in 1936 at Sotheby's, but were not available for public research until the 1990s.

Saturday, September 21, 2013

Equinox Timescapes

Pearls Centre Sunset, 2013. Singapore Time Dimensions Series. Image Source: fqwimages © artist Fong Qi Wei.

To mark the balance of time at today's Equinox (at 20:44 UTC), here are some urban timescape photos which simultaneously depict day and night, sunrise and sunset, the rush and the quiet. See more photos from Fong Qi Wei's Singaporean timescape series here. See similar American kaleidoscope timescape photos by Michael Shainblum below the jump. These are stills from a short film called Mirror City:
Mirror City is the latest video from photographer and filmmaker Michael Shainblum that takes time-lapse footage of Chicago, San Francisco, San Diego, Las Vegas and Los Angeles and runs it through a constantly shifting kaleidoscopic pattern of mirrors. Shainblum says the piece took about four months to edit and adds:
These clips were all processed from their original form, into the kaleidoscopic visuals that you see in this video. Many people visit these large cities every day, and all of these places have been shot and filmed, but I wanted to emulate these urban landscapes in a way that nobody has even seen before. I wanted to put man-made geometric shapes, mixed with elements of color and movement to create less of a structured video, and more of a plethora of visual stimulation.
(Hat tip: This is Colossal).

National Day Preview 02, 2013. Singapore Time Dimensions Series. Image Source: fqwimages © artist Fong Qi Wei.

Tiong Bahru HDB Sunset, 2013. Singapore Time Dimensions Series. Image Source: fqwimages © artist Fong Qi Wei.

Guns and Circuses: The New 3D Printing Search Engine

Cody Wilson. Image Source: Wired.

Caption for the above photograph: "Cody Wilson, a 24-year-old law student at the University of Texas, didn't invent the concept of printable, downloadable guns. He's only created the first platform devoted to sharing the blueprints online for free to anyone who wants one, anywhere in the world, at any time. Wilson and his group of amateur gunsmiths, known as Defense Distributed, are also currently working on producing what may become the world's first fully 3-D printed gun, which they call the 'Wiki Weapon.'"

The Gen Y entrepreneurs who brought us the 3D printed gun (primarily designer Cody Wilson, whom Wired named one of "The 15 Most Dangerous People in the World") did not like government authorities censoring them. So they've invented a new search engine devoted to 3D print model schematics. Oh, and it functions like the Pirate Bay.

Friday, September 20, 2013

Faux Antiques and Real Antiques

Image Source: Metafilter.

Not far from my house, there is a big Victorian house going up for auction at the end of the month. The owner is an antiques dealer. The word is that an old lady lived in the house for decades. When she died, the dealer picked it up for a song. He renovated it, kitted it out like a museum, and is now selling the contents and property, hoping, I guess, to turn it over for a profit. I saw the property at the auctioneer's open house, and what got me thinking about antiques was the basement.

Thursday, September 19, 2013

One Way Ticket to Mars

The Mars One white coffee mug. Image Source: Mars One.

A One-Way Ticket to Mars. Would You Like to Know More? The Mars One competition, in which the whole world was invited to apply for a one-way ticket to Mars, ended on 31 August 2013 (see my earlier post here). Now the organizers are sifting through the applicants, whom you can see here. If all goes as planned, the winners will be sent to the Red Planet in 2023. The project is crowd-funded: the organizers ask you to buy a coffee mug or other souvenirs to support the colonization effort.

Wednesday, September 18, 2013

Cells That Reverse the Arrow of Evolutionary Time

Fission yeast aka Schizosaccharomyces pombe. Image Source: University of Tübingen.
Catastrophic failure or progressive decline? These are the two modes of cellular degeneration. For example, some cells, such as cancer cells, do not age. One commenter at Naked Science Forum notes: "some mutations which cause cancer are not actually causing excessive cell division but a mutation upon the gene which controls programmed cell death... so they don't die when they should and you thus end up with accumulation."

Similarly, researchers have found a type of yeast that does not age (that is, it does not show cellular damage and wear as cells divide over time), but rather, it gets younger as its cells divide. These particular yeast cells do die, but as a result of sudden, catastrophic failure at any given moment, rather than through a progressive decline.
Under favorable conditions, the microbe, a species of yeast called S. pombe, does not age the way other microbes do, the researchers said.

Typically, when single-celled organisms divide in half, one half acquires the majority of older, often damaged cell material, while the other half acquires mostly new cell material.

But in the new study, researchers found that under favorable, nonstressful growing conditions, S. pombe (a single-celled organism) divided in such a way that both halves acquired about equal parts of old cell material. "As both cells get only half of the damaged material, they are both younger than before," study researcher Iva Tolic-Nørrelykke, of the Max Planck Institute of Molecular Cell Biology and Genetics in Germany, said in a statement.

What's more, previous research has shown that when cells divide and continuously pass on old cell material, the cells that get the old material start to divide more slowly — a sign of aging. This has been seen in microorganisms such as E. coli and the yeast S. cerevisiae.

But in the new study, S. pombe cells showed no increase in the time it took for them to divide, the researchers said.

That's not to say that S. pombe cells don't die. Some cells did die in the study, but the deaths occurred suddenly, as a result of a catastrophic failure of a cellular process, rather than aging, the researchers said.

The researchers said they are not arguing that any given component of S. pombe cells is immortal. If a particular component of a cell is followed for a long enough time, the researchers believe the cell that harbors this component will eventually die. But "the probability of this death will be constant rather than increasing over time," the researchers wrote in the Sept. 12 issue of the journal Current Biology.

During unfavorable, stressful conditions, S. pombe cells distribute old cell material unevenly, and the cells that inherited the old material eventually died, the study found. Also, during stressful conditions, S. pombe showed an increase in division time.

Although there's no way to know for sure why the researchers did not detect aging in S. pombe under favorable conditions, one likely explanation is that the cellular damage is being repaired at the same rate that it's being formed, said Eric Stewart, a microbiologist at Northeastern University in Boston, who was not involved in the study.

But just because the study researchers did not detect aging in favorable conditions doesn't mean that it's not occurring. "They're trying to show the absence of something," in this case, aging, Stewart said. "Showing the absence of something is a nearly impossible challenge," he said.

S. pombe growth under favorable conditions could potentially serve as a model of nonaging cell types, such as cancer cells, the researchers said.
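The excerpt above makes two quantitative claims: that splitting damaged material evenly keeps each daughter cell "younger than before," and that death strikes with constant rather than increasing probability. A toy model, entirely my own sketch and not taken from the study, shows why even division prevents damage from piling up, and why a constant death probability means a cell's age tells you nothing about its remaining lifespan:

```python
# Toy model (my own illustration, not from the Current Biology study):
# a cell gains a fixed amount of damage each cycle and, like S. pombe
# under favorable conditions, splits that damage evenly between daughters.
def damage_after(n_divisions, damage_per_cycle=1.0):
    damage = 0.0
    for _ in range(n_divisions):
        damage = (damage + damage_per_cycle) / 2  # symmetric split
    return damage

# The per-cell damage converges to a fixed point (solving d = (d + 1)/2
# gives d = 1) instead of accumulating, so the lineage never "ages".
print(damage_after(50))  # approaches 1.0

# The study's other claim -- a constant, rather than increasing,
# probability of death -- is the memoryless property: with a fixed
# per-division death chance p, an "old" cell's odds of surviving its
# next 10 divisions equal a "young" cell's.
p = 0.01
print((1 - p) ** 10)  # the same for any cell, whatever its age
```

By contrast, asymmetric division (one daughter keeping all the old material) would let damage in that lineage grow without bound, which matches the slowing division times reported for E. coli and S. cerevisiae.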
On the logic of non-aging cancer cells, I have seen reports that cancer cells are resistant to radiation. Researchers ask: did this condition arise in reaction to radioactive treatments? Or does cancer's radioresistance precede radiation treatments? The conventional wisdom is that cancer involves a genetic predisposition that is triggered by an external factor. Is cancer a body's misguided reaction against radiation, other pollutants in the environment, or viruses? I have seen reports that cancer cells ferment sugar for energy, unlike normal cells, which burn their fuel with oxygen - which is an argument to stop eating sugar if I ever saw one. Is the way cancer works - or the way other non-aging cells work - the grim key to immortality?

Researcher Paul Davies - author of The Goldilocks Enigma - wrote a 2012 piece for The Guardian asking whether cancer is actually a way for a multi-cellular organism to regress to the single-celled model, in which cells do not seem to age. Thus, he postulates, cancer essentially reverses the normal course of evolution from single cell to multicellular organism, even as the disease reverses the clock on cell death processes. But the question remains: why does cancer do this? What purpose is an evolutionary reversal trying to serve? Davies and an Australian physicist, Charles Lineweaver, maintain that cancer de-evolves sufferers at the cellular level: the disease activates increasingly archaic genes in a body as it spreads. Lineweaver claims that cancer is a "default cellular safe mode." From The Guardian report:
In the frantic search for an elusive "cure", few researchers stand back and ask a very basic question: why does cancer exist? What is its place in the grand story of life? Astonishingly, in spite of decades of research, there is no agreed theory of cancer, no explanation for why, inside almost all healthy cells, there lurks a highly efficient cancer subroutine that can be activated by a variety of agents – radiation, chemicals, inflammation and infection.
Cancer, it seems, is embedded in the basic machinery of life, a type of default state that can be triggered by some kind of insult. That suggests it is not a modern aberration but has deep evolutionary roots, a suspicion confirmed by the fact that it is not confined to humans but is widespread among mammals, fish, reptiles and even plants. Scientists have identified genes implicated in cancer that are thought to be hundreds of millions of years old. Clearly, we will fully understand cancer only in the context of biological history.
Two relevant evolutionary transitions stand out. The first occurred over 2 billion years ago, when large, complex cells emerged containing mitochondria – tiny factories that supply energy to the cell. Biologists think mitochondria are the remnants of ancient bacteria. Tellingly, they undergo systematic changes as cancer develops, profoundly altering their chemical and physical properties.
For most of Earth's history, life was confined to single-celled organisms. Over time, however, a new possibility arose. Earth's atmosphere became polluted by a highly toxic and reactive chemical – oxygen – created as a waste product of photosynthesis. Cells evolved ingenious strategies to either avoid the accumulating oxygen or to combat oxidative damage in their innards. But some organisms turned a vice into a virtue and found a way to exploit oxygen as a potent new source of energy. In modern organisms, it is mitochondria that harness this dangerous substance to power the cell.
With the appearance of energised oxygen-guzzling cells, the way lay open for the second major transition relevant to cancer – the emergence of multicellular organisms. This required a drastic change in the basic logic of life. Single cells have one imperative – to go on replicating. In that sense, they are immortal. But in multicelled organisms, ordinary cells have outsourced their immortality to specialised germ cells – sperm and eggs – whose job is to carry genes into future generations. The price that the ordinary cells pay for this contract is death; most replicate for a while, but all are programmed to commit suicide when their use-by date is up, a process known as apoptosis. And apoptosis is also managed by mitochondria.
Cancer involves a breakdown of the covenant between germ cells and the rest. Malignant cells disable apoptosis and make a bid for their own immortality, forming tumours as they start to overpopulate their niches. In this sense, cancer has long been recognised as a throwback to a "selfish cell" era. But recent advances in research permit us to embellish this picture. For example, cancer cells thrive in low-oxygen (even zero-oxygen) conditions, reverting to an earlier, albeit less efficient, form of metabolism known as fermentation.
Biologists are familiar with the fact that organisms may harbour ancient traits that reflect their ancestral past, such as the atavistic tails or supernumerary nipples some people are born with. Evolution necessarily builds on earlier genomes. Sometimes older genetic pathways are not discarded, just silenced. Atavisms result when something disrupts the silencing mechanism.
Charles Lineweaver, of the Australian National University, and I have proposed a theory of cancer based on its ancient evolutionary roots. We think that as cancer progresses in the body it reverses, in a speeded-up manner, the arrow of evolutionary time. Increasing deregulation prompts cancer cells to revert to ever earlier genetic pathways that recapitulate successively earlier ancestral life styles. We predict that the various hallmarks of cancer progression will systematically correlate with the activation of progressively older ancestral genes. The most advanced and malignant cancers recreate aspects of life on Earth before a billion years ago.
Ancient genes remain functional only if they continue to fulfill a biological purpose. In early-stage embryo development, when the basic body plan is laid down (also in low-oxygen conditions, incidentally) ancestral genes help guide developmental processes before being switched off. Every human, for example, possesses tails and gills for a time in the womb. Significantly, researchers have recently identified examples of early-stage embryonic genes being reawakened in cancer.
The deep links between evolutionary biology, developmental biology and cancer have huge implications for therapy, and also provide an unexpected reason to study cancer. By unravelling the details of cancer initiation and progression, scientists can open a window on the past through which we can gain tantalising glimpses of life in a bygone age.
You can see a further online article from Lineweaver in Physics World at http://www.physicsworld.com/cws/download/jul2013. This is a special issue, made free to the public, which deals with the physics of cancer.

Tuesday, September 17, 2013

Class, Work and Time

Image Source: Work Mommy Work.

Danish researchers have determined that social class (and its associated jobs) determines how parents perceive time. Parents' perceptions of time, in turn, tend to influence what subjects their children study. The class distinction is shaped by the difference in how work is paid: the professional salary versus the hourly wage. The Parent Herald reports:
The study, "The Educational Strategies of Danish University Students from Professional and Working-Class Backgrounds," is based on 60 interviews with Danish students from six different university-level study programmes: medicine, architecture, sociology, economics, pharmacy and business studies.
Young people of parents with university degrees choose a 24-hour culture
... "For young people whose parents are university educated, factors such as prestige and a strong sense of professional identity are important. They are attracted by an educational culture in which you are a student 24/7, and where leisure activities are tied to the identity that lies within your studies. These young people have also grown up with topical discussions around the dinner table which also prepares them for their lives as students," says [co-author] Jens Peter Thomsen.
Young people from working class backgrounds choose '9 to 5' studies
When young people from working-class homes with good A-level grades choose paths other than the prestigious studies, it is partly because they want a clearly defined aim for their studies.

"The young people who are first-generation university students often choose studies that are more '9 to 5' and less tied up to a sense of identity. They have lower academic expectations of themselves, and they choose studies with a clearly defined goal for their professional lives," in sectors where jobs are easily found.

They do not choose to study, for example, sociology, because it can be difficult to know what it might lead to job-wise, says the education sociologist. ... [Jens Peter Thomsen also] mentions that medical students from families of doctors may have a different view of the patient than a young person from a working-class background who also chooses to study medicine.
Who knows how generally true these findings are, given the study's small sample? At any rate, the researchers suggest that middle-class students associate a broader understanding of time and work with their identity, and that their job choices are secondary to that identity. That is, their work may be their life first, and their livelihood second.

Students from working-class families, on the other hand, tend to take a pragmatic view, with a tighter association between work and time; nor do they tie their personal identity to that more limited view. In other words, they may seek to shape personal identity outside their pragmatic job choices.

The researchers also found that the class-determined perception of time often persists across a generation, regardless of whether the graduates demonstrate upward or downward mobility. Finally, the study indicates that these class distinctions hold true, whether the university education in question is public and subsidized (and access to all fields of education is more equal) - or private and very expensive (and access to all fields of education is unequal).

Monday, September 16, 2013

The Coming Siege Against Cognitive Liberties

Image Source: Nature.

The Chronicle of Higher Education reports on how researchers are debating the legal implications of technological advances in neuroscanning:
Imagine that psychologists are scanning a patient's brain, for some basic research purpose. As they do so, they stumble across a fleeting thought that their equipment is able to decode: The patient has committed a murder, or is thinking of committing one soon. What would the researchers be obliged to do with that information?

That hypothetical was floated a few weeks ago at the first meeting of the Presidential Commission for the Study of Bioethical Issues devoted to exploring societal and ethical issues raised by the government's Brain initiative (Brain Research through Advancing Innovative Neurotechnologies), which will put some $100-million in 2014 alone into the goal of mapping the brain. ...

One commissioner ... has been exploring precisely those sorts of far-out scenarios. Will brain scans undermine traditional notions of privacy? Are existing constitutional protections sufficient to guard our freedom of thought, or are new laws required as fMRI scanners and EEG detectors grow ever more precise?

Asking those questions is the Duke University associate professor of law Nita A. Farahany ... . "We have this idea of privacy that includes the space around our thoughts, which we only share with people we want to ... . Neuroscience shows that what we thought of as this zone of privacy can be breached." In one recent law-review article, she warned against a "coming siege against cognitive liberties."

Her particular interest is in how brain scans reshape our understanding of, or are checked by, the Fourth and Fifth Amendments of the Constitution. Respectively, they protect against "unreasonable searches and seizures" and self-incrimination, which forbids the state to turn any citizen into "a witness against himself." Will "taking the Fifth," a time-honored tactic in American courtrooms, mean anything in a world where the government can scan your brain? The answer may depend a lot on how the law comes down on another question: Is a brain scan more like an interview or a blood test? ...

Berkeley's [Jack] Gallant says that although it will take an unforeseen breakthrough, "assuming that science keeps marching on, there will eventually be a radar detector that you can point at somebody and it will read their brain cognitions remotely." ...

Moving roughly from less protected to more protected ... [Farahany's] categories [for reading the brain in legal terms] are: identifying information, automatic information (produced by the brain or body without effort or conscious thought), memorialized information (that is, memories), and uttered information. (Contrary to idiomatic usage, her "uttered" information can include information uttered only in the mind. At the least, she observes, we may need stronger Miranda warnings, specifying that what you say, even silently to yourself, can be used against you.) ...

In a book to be published next month, Mind, Brains, and Law: The Conceptual Foundations of Law and Neuroscience (Oxford University Press), Michael S. Pardo and Dennis Patterson directly confront Farahany's work. They argue that her evidence categories do not necessarily track people's moral intuitions—that physical evidence can be even more personal than thought can. "We assume," they write, "that many people would expect a greater privacy interest in the content of information about their blood"—identifying or automatic information, like HIV status—"than in the content of their memories or evoked utterances on a variety of nonpersonal matters."

On the Fifth Amendment question, the two authors "resist" the notion that a memory could ever be considered analogous to a book or an MP3 file and be unprotected, the idea Farahany flirts with. And where the Fourth Amendment is concerned, Pardo, a professor of law at the University of Alabama, writes in an e-mail, "I do think that lie-detection brain scans would be treated like blood draws." ...

[Farahany] says ... her critics are overly concerned with the "bright line" of physical testimony: "All of them are just grappling with current doctrine. What I'm trying to do is reimagine and newly conceive of how we think of doctrine."

"The bright line has never worked," she continues. "Truthfully, there are things that fall in between, and a better thing to do is to describe the levels of in-betweenness than to inappropriately and with great difficulty assign them to one category or another."

Among those staking out the brightest line is Paul Root Wolpe, a professor of bioethics at Emory University. "The skull," he says, "should be an absolute zone of privacy." He maintains that position even for the scenario of the suspected terrorist and the ticking time bomb, which is invariably raised against his position.
"As Sartre said, the ultimate power or right of a person is to say, 'No,'" Wolpe observes. "What happens if that right is taken away—if I say 'No' and they strap me down and get the information anyway? I want to say the state never has a right to use those technologies." ... Farahany stakes out more of a middle ground, arguing that, as with most legal issues, the interests of the state need to be balanced against those of the individual. ...

The nonprotection of automatic information, she writes, amounts to "a disturbing secret lurking beneath the surface of existing doctrine." Telephone metadata, another kind of automatic information, can, after all, be as revealing as GPS tracking.

Farahany starts by showing how the secrets in our brains are threatened by technology. She winds up getting us to ponder all the secrets that a digitally savvy state can gain access to, with silky and ominous ease.
Much of the discussion among these legal researchers involves thinking about cognitive legal issues (motivations, actions, memories) in a way that is strongly influenced by computer-based metaphors. This is part of the new transhuman bias, evident in many branches of research. It confirms a surprising point: posthumanism is not some future hypothetical reality in which we all have chips in our brains and are cybernetically enhanced. It is an often-unconsidered way of life for people who are still 100 per cent human; it is a way of seeing the world.

This is the 'soft' impact of high technology, where there is an automatic assumption that we, our brains, or the living world around us, are like computers, with data which can be manipulated and downloaded.

In other words, it is not just the hard gadgetry of technological advances that initiates these insidious changes in law and society. If we really want to worry about the advent of a surveillance state, we must question the general mindset of tech industry designers, and people in general, who are unconsciously mimicking computers in the way they try to understand the world. From this unconscious mimicry comes changes to society for which computers are not technically responsible.

A false metaphorical correlation between human and machine - the expectation that organic lives must be artificially automated - is corrosive to the assumptions upon which functioning societies currently rest. These assumptions are what Farahany would call "current doctrine." We take 'current doctrine' for granted. But at the same time, we now take for granted ideas that make 'current doctrine' increasingly impossible to maintain.

This is not to say that change is unnecessary or that technology has not brought vast improvements.

But is it really necessary for everything to go faster and faster? Do we need to be accessible to everyone and everything, day and night? Should our bosses, the police, the government, corporations, the media, let alone other citizens, know everything we do and think in private? Do we really need more powerful technology every six months? Why is it necessary that our gadgets increasingly become smaller, more intimate, and physically attached to us? Why is it not compulsory for all of us to learn (by this point) to a standard level how computers are built and how they are programmed?

We are accepting technology as a given, without second-guessing the core assumptions driving that acceptance. Because we do not question what seems 'obvious,' researchers press on, in an equally unconsidered fashion, with their expectation that a total surveillance state is inevitable.

Sunday, September 15, 2013

A Year in the Life of a Tree

The Denver Post reports on a man who photographed a Bur Oak every day for one year and posted the photos on Facebook; he started on 24 March 2012. His experience showed that the simple act of slowing down and carefully looking at one other living thing can change one's whole perception of the world:
There is a tree that stands alone among the cornfields - about 5 miles south of Platteville, Wisconsin in the southwest corner of the state. Photographer Mark Hirsch drove by it almost every day for 19 years and never once stopped to take a picture. Then one day, he did. ...
“It was never easy and it never came naturally,” writes Hirsch. “But when I found that scene, situation or moment that made me comfortable that I had made a worthy picture for the day, it was incredibly rewarding personally. At some point, I really began to appreciate the contemplative nature of my visits to that tree.”
At first mention, a year in the life of a tree might not immediately sound interesting, visually or otherwise. Hirsch’s pictures, however, uncover a complex web of life and color surrounding the tree.
“I would describe that Tree as I would a friend,” writes Hirsch. “My initial description a year ago would have been as simple as a tree in a corn field, but now I would describe it as a tree of life in its own realm.”
“I was never very good at slowing down but I am now. I’ve learned to see things differently. And I’ve embraced an incredible appreciation for the land in and around that tree.”
By the 365th day, the project had become so renowned on Facebook that "on March 23, 2013, Hirsch took the last official pictures of the project ... [and a]lmost 300 people (and 12 dogs) showed up for a group photo under the branches of that tree [below]. Some devoted fans even drove in from Milwaukee, Chicago and northern Minnesota to be in the picture." See some of the photos below the jump, more or less in chronological order from spring 2012 to spring 2013 (they are taken from the Denver Post report or from Facebook); and see the Facebook page with the full album here. Hirsch also published his photos in a book. All photos are © Mark Hirsch and are reproduced here under Fair Use for non-commercial review and discussion.

Group photo (23 March 2013), last day of the project.