The CIA has had a long history in Hollywood. During the 1950s, CIA asset Luigi G Luraschi used his position as head of censorship at Paramount Studios to bring film content in line with the Agency's ideals. Scenes that portrayed the US in a bad light were cut; films such as High Noon (1952) were prevented from receiving certain industry awards; and well-dressed 'negroes' were placed in lavish on-screen environments to suggest that the US didn't have a race problem. In order to tame or otherwise subvert their content, the CIA also covertly assisted on the film adaptations of George Orwell's Nineteen Eighty-Four (1954) and Animal Farm (1955), as well as Graham Greene's The Quiet American (1958).
Tag Archives: George Orwell
Amazon Secretly Deletes "1984" From the Kindle
(Silver Underground) -Imagine this. You're reading George Orwell's "1984," a dystopian novel about a tyrannical media state that controls the minds of the people by constantly altering the available language and information. Then, when you're about a third of the way through the book, and it's just getting good, it disappears down the memory hole without explanation.
Unbeknownst to many, Amazon has the ability to remotely edit or remove content from people's Kindle e-book readers. What an ironic choice for the debut of this power.
Justin Gawronski, 17, is suing Amazon after his purchased copy of 1984 was surreptitiously removed from his reader without warning. Frankly, I was amazed to read that this high school still assigned 1984. I thought it was removed from the standard curriculum a few years ago, but Gawronski is reportedly in an advanced placement course, so maybe that's why.
This has resulted in a class action lawsuit against Amazon seeking damages for everyone who had their e-books deleted, but more importantly it seeks a ban on future deletions. Both of Orwell's dystopian classics, "1984" and "Animal Farm," were removed from the Kindle Store, and retroactively from customers' devices, allegedly over an intellectual property dispute.
Gawronski wrote, "I thought that once purchased, the books were mine," and Jay Edelson, the Chicago lawyer who filed the lawsuit, said, "Amazon.com had no more right to hack into people's Kindles than its customers have the right to hack into Amazon's bank account to recover a mistaken over-payment." That certainly makes sense to me, but in my experience the state never misses an opportunity to treat 1984 like an instruction manual instead of a cautionary tale.
Amazon's actions could have far-reaching consequences for state intervention in private media consumption. Is it that hard to imagine? The state long ago decided that library records were a matter of National Security, and most Internet and cellphone service providers hand over your records if the state asks for them. Is it that far-fetched to imagine the state would take similar interest in our digital libraries? And if certain books are deemed a matter of National Security, doesn't it just make sense that they might delete them? For security, of course.
Edelson added, “Technology companies increasingly feel that because they have the ability to access people’s personal property, they have the right to do so.” But doesn’t that exactly describe the government?
It's potentially spookier than just secret deletions. Periodically the state issues so-called "National Security Letters," which are both a demand to cooperate in an investigation by turning over private records and personal data, and a gag order, preventing the recipient from disclosing the content of the letter.
How long before we boot up our Kindles and discover that the state has ordered Amazon to replace segments of our library with National Security Letters? For security, of course.
Knowing something about e-publishing, I can tell you it's possible for authors to make changes to their e-books and correct errors in an updated version. This is really handy because it means missing a typo in the editorial process doesn't mean you have 10,000 misprints of a book. You can fix it. But doesn't it also mean that it's possible for Amazon or the state to make subtle changes to e-books that you may never even notice?
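Silent edits would still change the bytes of the files on your device, so one way to catch them is to fingerprint your library and compare snapshots over time. Here's a minimal sketch in Python; the folder layout and file names are hypothetical, and this only works for e-book files you can actually copy off the device:

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large e-book files don't load into memory at once.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(library: Path) -> dict:
    """Record a digest for every file in the library folder."""
    return {str(p): fingerprint(p)
            for p in sorted(library.rglob("*")) if p.is_file()}

def diff_manifests(old: dict, new: dict) -> dict:
    """Report files altered, removed, or added since the last snapshot."""
    return {
        "altered": [p for p in old if p in new and old[p] != new[p]],
        "removed": [p for p in old if p not in new],
        "added": [p for p in new if p not in old],
    }
```

Run `build_manifest` periodically against a copy of your e-book folder and diff the results: a book that was "corrected" behind your back shows up under `altered`, and one sent down the memory hole shows up under `removed`.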
These practices may not be legal yet, but since when has law prevented the state from grabbing power? The point is the technology exists, so it’s just a matter of time, just one national crisis away.
This all has startling implications for literature in general. If we can't control the literature we supposedly own, e-books represent a dramatic potential destruction of knowledge. In the past, when totalitarian ideologies came to power they burned books, and rebels preserved manuscripts in secret, even when the sentence was death. The next totalitarian regime will have a kill switch on every book… So perk up, rebels. If you've got books that you want to survive the coming collapse, start backing up your files someplace safe.
Find out where you can see Silver Circle by checking our theater and special screening schedule on our event page.
IBM: Computers Will See, Hear, Taste, Smell And Touch In 5 Years
(Beforeitsnews) -IBM (NYSE: IBM) unveiled the seventh annual "IBM 5 in 5," a list of innovations that have the potential to change the way people work, live and interact during the next five years.
The IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM’s R&D labs around the world that can make these transformations possible.
This year’s IBM 5 in 5 explores innovations that will be the underpinnings of the next era of computing, which IBM describes as the era of cognitive systems. This new generation of machines will learn, adapt, sense and begin to experience the world as it really is. This year’s predictions focus on one element of the new era, the ability of computers to mimic the human senses—in their own way, to see, smell, touch, taste and hear.
These sensing capabilities will help us become more aware and productive, and help us think – but not think for us. Cognitive computing systems will help us see through complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives and break down all kinds of barriers—including geographic distance, language, cost and inaccessibility.
“IBM scientists around the world are collaborating on advances that will help computers make sense of the world around them,” said Bernie Meyerson, IBM Fellow and VP of Innovation. “Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges.”
[Photo caption: Physical Analytics Research Manager Hendrik Hamann examines an array of wireless sensors used to detect environmental conditions such as temperature, humidity, gases and chemicals at IBM Research headquarters in Yorktown Heights, NY, Monday, December 17, 2012. In five years, technology advancements could enable sensors to analyze odors or the molecules in a person's breath to help diagnose diseases. (Jon Simon/Feature Photo Service for IBM)]
Here are five predictions that will define the future:
Touch: You will be able to touch through your phone
Imagine using your smartphone to shop for your wedding dress and being able to feel the satin or silk of the gown, or the lace on the veil, all from the surface of the screen. Or imagine feeling the beading and weave of a blanket made by a local artisan halfway around the world. In five years, industries such as retail will be transformed by the ability to "touch" a product through your mobile device. IBM scientists are developing applications for the retail, healthcare and other sectors using haptic, infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric, as a shopper brushes her finger over the image of the item on a device screen. Utilizing the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material.
Current uses of haptic and graphic technology in the gaming industry take the end user into a simulated environment. The opportunity and challenge here is to make the technology so ubiquitous and interwoven into everyday experiences that it brings greater context to our lives, turning mobile phones into tools for natural and intuitive interaction with the world around us.
Sight: A pixel will be worth a thousand words
We take 500 billion photos a year[1]. 72 hours of video is uploaded to YouTube every minute[2]. The global medical diagnostic imaging market is expected to grow to $26.6 billion by 2016[3].
Computers today only understand pictures by the text we use to tag or title them; the majority of the information — the actual content of the image — is a mystery. In the next five years, systems will not only be able to look at and recognize the contents of images and visual data, they will turn the pixels into meaning, beginning to make sense out of it similar to the way a human views and interprets a photograph. In the future, “brain-like” capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media. This will have a profound impact for industries such as healthcare, retail and agriculture.
Within five years, these capabilities will be put to work in healthcare by making sense out of massive volumes of medical information such as MRIs, CT scans, X-Rays and ultrasound images to capture information tailored to particular anatomy or pathologies. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images — such as differentiating healthy from diseased tissue — and correlating that with patient records and scientific literature, systems that can “see” will help doctors detect medical problems with far greater speed and accuracy.
Hearing: Computers will hear what matters
Ever wish you could make sense of the sounds all around you and be able to understand what’s not being said?
Within five years, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. It will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead. Raw sounds will be detected by sensors, much as they are by the human ear. A system that receives this data will take into account other "modalities," such as visual or tactile information, and classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and the ability to recognize patterns.
For example, “baby talk” will be understood as a language, telling parents or doctors what infants are trying to communicate. Sounds can be a trigger for interpreting a baby’s behavior or needs. By being taught what baby sounds mean – whether fussing indicates a baby is hungry, hot, tired or in pain – a sophisticated speech recognition system would correlate sounds and babbles with other sensory or physiological information such as heart rate, pulse and temperature.
In the next five years, by learning about emotion and being able to sense mood, systems will pinpoint aspects of a conversation and analyze pitch, tone and hesitancy to help us have more productive dialogues that could improve customer call center interactions, or allow us to seamlessly interact with different cultures.
Today, IBM scientists are beginning to capture underwater noise levels in Galway Bay, Ireland, to understand the sounds and vibrations of wave energy conversion machines, and their impact on sea life, using underwater sensors that capture sound waves and transmit them to a receiving system to be analyzed.
Taste: Digital taste buds will help you to eat smarter
What if we could make healthy foods taste delicious using a different kind of computing system that is built for creativity?
IBM researchers are developing a computing system that actually experiences flavor, to be used with chefs to create the most tasty and novel recipes. It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavors and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham. A system like this can also be used to help us eat healthier, creating novel flavor combinations that will make us crave a vegetable casserole instead of potato chips.
The computer will be able to use algorithms to determine the precise chemical structure of food and why people like certain tastes. These algorithms will examine how chemicals interact with each other, the molecular complexity of flavor compounds and their bonding structure, and use that information, together with models of perception, to predict the taste appeal of flavors.
Not only will it make healthy foods more palatable — it will also surprise us with unusual pairings of foods actually designed to maximize our experience of taste and flavor. In the case of people with special dietary needs, such as individuals with diabetes, it would develop flavors and recipes that keep their blood sugar regulated but still satisfy their sweet tooth.
Smell: Computers will have a sense of smell
During the next five years, tiny sensors embedded in your computer or cell phone will detect if you're coming down with a cold or other illness. By analyzing odors, biomarkers and thousands of molecules in someone's breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odors are normal and which are not.

Today IBM scientists are already sensing environmental conditions and gases to preserve works of art. This innovation is beginning to be applied to tackle clinical hygiene, one of the biggest challenges in healthcare today. For example, antibiotic-resistant bacteria such as Methicillin-resistant Staphylococcus aureus (MRSA), which in 2005 was associated with almost 19,000 hospital stay-related deaths in the United States, is commonly found on the skin and can be easily transmitted wherever people are in close contact. One way of fighting MRSA exposure in healthcare institutions is by ensuring medical staff follow clinical hygiene guidelines. In the next five years, IBM technology will "smell" surfaces for disinfectants to determine whether rooms have been sanitized. Using novel wireless "mesh" networks, sensors will gather and measure data on various chemicals, and continuously learn and adapt to new smells over time.
Due to advances in sensor and communication technologies in combination with deep learning systems, sensors can measure data in places never thought possible. For example, computer systems can be used in agriculture to "smell" or analyze the soil condition of crops. In urban environments, this technology will be used to monitor issues with refuse, sanitation and pollution – helping city agencies spot potential problems before they get out of hand.
5 Unbelievably Creepy Surveillance Tactics
(Alternet) -Since the erosion of Americans' civil liberties depends on high levels of public apathy, some of the most dangerous privacy breaches take place incrementally and under the radar; if it invites comparisons to Blade Runner or Orwell, then someone in the PR department didn't do their job. Meanwhile, some of the biggest threats to privacy, like insecure online data or iPhone GPS tracking, are physically unobtrusive and therefore easily ignored. And it'll be at least a year or two until the sky is overrun by spy drones.
So when a method of surveillance literally resembles a prop or plot point in a sci-fi movie, it helps to reveal just how widespread and sophisticated commercial and government monitoring has become. Here are five recent developments that seem almost unreal in their dystopian creepiness.
1. Buses and street cars that can hear what you say.
You can’t really go anywhere in America without being tracked by surveillance cameras. But seeing what people do is not enough; according to a report by the Daily, cities all over the country are literally bugging public transportation.
In San Francisco, city officials have plans to install surveillance cameras that record sound on 357 buses and trolley cars, the Daily reported. Eugene, Oregon; Columbus; Hartford; and Athens, Georgia also have audio recording plans in the works. The systems have the capacity to filter background noise and home in on passengers' conversations.
Officials have said that the system is merely intended to help resolve disputes between bus riders. San Francisco officials did not comment, but the Daily found a similar justification in procurement documents for the technology. “The purpose of this project is to replace the existing video surveillance systems in SFMTA’s fleet of revenue vehicles with a reliable and technologically advanced system to increase passenger safety and improve reliability and maintainability of the system.”
It’s nice that the Department of Homeland Security, which covered the entire cost of San Francisco’s system, is so committed to ensuring pleasant bus rides for passengers.
2. Mannequins that can see you.
A handful of retailers in the US and Europe are installing mannequins in their stores that can determine customers’ age, gender and race, Bloomberg reported last month. Don’t worry, the face recognition-equipped camera is hidden, so there is no way to tell whether the giant plastic dolls in the store are watching you as you shop. The company that developed the mannequins (named EyeSee) sells their attributes thusly:
This special camera installed inside the mannequin’s head analyzes the facial features of people passing through the front and provides statistical and contextual information useful to the development of targeted marketing strategies. The embedded software can also provide other data such as the number of people passing in front of a window at certain times of the day.
They are also developing audio technology that can pick up key words from customer conversations to help them tailor their marketing plans. A screen that displays advertising geared specifically to each customer's demographic is also in EyeSee's future.
Really, wouldn’t the ideal marketing scenario be if human customers were replaced by mannequins programmed to buy everything the other mannequins were selling?
3. Biometric time clocks.
For too long, employers lacked the ability to extract every second of labor from their workers with scientific precision. Thanks to the wonders of face recognition technology, many employees in low-wage workplaces are now required to log in to work on face recognition readers instead of using key cards or codes. Biometric time clocks like FaceIn, most commonly used at construction sites, create an avatar of the worker's face that the machine keeps forever and that ages alongside the employee. Allegedly, it can tell twins apart.
Meanwhile, many fast food restaurants and retailers have started using biometric time clocks that record digital fingerprints, like the creepily named U.are.U digital fingerprint reader, to prevent employees from coming in late or giving out discounts.
4. Tagging children.
It's probably best to train people in robotic discipline early, and many US schools, aided by surveillance technology vendors, are on it. Last month, a Texas sophomore sued her school district for making students carry RFID chips that tracked their movements, but that's just the start. School administrators all over the country use CCTV cameras, RFID chips, and GPS tracking to monitor where students go and what they do, as David Rosen reported for AlterNet. One pilot program for middle schoolers used GPS to make sure kids aren't late:
Each school day, the delinquent students get an automated “wake-up” phone call reminding them that they need to get to school on time. In addition, five times a day they are required to enter a code that tracks their locations: as they leave for school, when they arrive at school, at lunchtime, when they leave school and at 8pm. These students are also assigned an adult “coach” who calls them at least three times a week to see how they are doing and help them find effective ways to make sure they get to school.
5. Biometric databases.
Federal agencies ranging from the DoD to the FBI to the DHS are revamping their databases to include iris scans, voice patterning, measures of gait, face recognition, and records of scars and tattoos. They also have a mandate to indiscriminately share this information between agencies and with unnamed foreign entities.