Scientific Totalitarianism (our future post 2020) - Page 3 - Politics Forum.org | PoFo


Theories and happenings too odd for the main forums.
#14861083
Billionaire ex-Facebook president Sean Parker unloads on Mark Zuckerberg and admits he helped build a monster

Sean Parker, the first president of Facebook, has a disturbing warning about the social network: "God only knows what it's doing to our children's brains."

Speaking to the news website Axios, the entrepreneur and executive talked openly about what he perceives as the dangers of social media and how it exploits human "vulnerability."

"The thought process that went into building these applications, Facebook being the first of them ... was all about: 'How do we consume as much of your time and conscious attention as possible?'" said Parker, who joined Facebook in 2004, when it was less than a year old.

"And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever," he told Axios. "And that's going to get you to contribute more content, and that's going to get you ... more likes and comments."

Parker added: "It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."

"The inventors, creators — it's me, it's Mark [Zuckerberg], it's Kevin Systrom on Instagram, it's all of these people — understood this consciously," he said. "And we did it anyway."

Facebook did not immediately respond to Business Insider's request for comment.

Some in tech are growing disillusioned — and worried

Parker isn't the only tech figure to express disillusionment and worry over what they helped create. Tristan Harris, a former Google employee, has been outspoken in his criticism of how tech companies' products hijack users' minds.

"If you're an app, how do you keep people hooked? Turn yourself into a slot machine," he wrote in a widely shared Medium post in 2016.

"We need our smartphones, notifications screens and web browsers to be exoskeletons for our minds and interpersonal relationships that put our values, not our impulses, first," he continued. "People's time is valuable. And we should protect it with the same rigor as privacy and other digital rights."
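The "slot machine" mechanic Parker and Harris describe is, in behavioural terms, a variable-ratio reinforcement schedule: each pull (feed refresh, inbox check) pays off with some fixed probability, so individual rewards arrive unpredictably even though the long-run rate is steady. A toy simulation (my own illustrative sketch, not code from either source; the function name and parameters are hypothetical) makes the unpredictability visible:

```python
import random

def variable_ratio_session(pulls, p_reward, seed=0):
    """Simulate a variable-ratio schedule: each 'pull' (e.g. a feed
    refresh) pays off with fixed probability p_reward, so the gap
    between rewards is unpredictable even though the average rate
    is constant."""
    rng = random.Random(seed)
    gaps, since_last = [], 0
    for _ in range(pulls):
        since_last += 1
        if rng.random() < p_reward:
            gaps.append(since_last)  # pulls since the previous reward
            since_last = 0
    return gaps

gaps = variable_ratio_session(pulls=1000, p_reward=0.25, seed=1)
mean_gap = sum(gaps) / len(gaps)
# mean_gap clusters near 1 / p_reward, but individual gaps range
# widely -- that variance, not the average payout, is what keeps
# the user pulling the lever.
```

Psychology experiments since Skinner have found variable-ratio schedules to produce the most persistent responding of any reinforcement schedule, which is the vulnerability Parker says these products were built to exploit.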

In a recent feature, The Guardian spoke to tech workers and industry figures who have been critical of Silicon Valley business practices.

Loren Brichter, the designer who created the slot-machine-like pull-down-to-refresh mechanism now widely used on smartphones, said, "I've spent many hours and weeks and months and years thinking about whether anything I've done has made a net positive impact on society or humanity at all."

Brichter added: "Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I'm not saying I'm mature now, but I'm a little bit more mature, and I regret the downsides."

And Roger McNamee, an investor in Facebook and Google, told The Guardian: "The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences ... The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models."

The comments from Parker and others are further evidence of souring public sentiment about Silicon Valley. Once lauded in utopian terms, companies like Facebook have now come under heavy criticism over their role in the spread of "fake news" and Russian propaganda.

http://www.businessinsider.com/ex-faceb ... ty-2017-11
I'd like to point out: changing or abandoning social media's advertising models will not stop the beast... The medium is the message.

Why the modern world is bad for your brain

Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion. Earl Miller, a neuroscientist at MIT and one of the world experts on divided attention, says that our brains are “not wired to multitask well… When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.” So we’re not actually keeping a lot of balls in the air like an expert juggler; we’re more like a bad amateur plate spinner, frantically switching from one task to another, ignoring the one that is not right in front of us but worried it will come crashing down any minute. Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient.

Multitasking has been found to increase the production of the stress hormone cortisol as well as the fight-or-flight hormone adrenaline, which can overstimulate your brain and cause mental fog or scrambled thinking. Multitasking creates a dopamine-addiction feedback loop, effectively rewarding the brain for losing focus and for constantly searching for external stimulation. To make matters worse, the prefrontal cortex has a novelty bias, meaning that its attention can be easily hijacked by something new – the proverbial shiny objects we use to entice infants, puppies, and kittens. The irony here for those of us who are trying to focus amid competing activities is clear: the very brain region we need to rely on for staying on task is easily distracted. We answer the phone, look up something on the internet, check our email, send an SMS, and each of these things tweaks the novelty-seeking, reward-seeking centres of the brain, causing a burst of endogenous opioids (no wonder it feels so good!), all to the detriment of our staying on task. It is the ultimate empty-caloried brain candy. Instead of reaping the big rewards that come from sustained, focused effort, we instead reap empty rewards from completing a thousand little sugar-coated tasks.

In the old days, if the phone rang and we were busy, we either didn’t answer or we turned the ringer off. When all phones were wired to a wall, there was no expectation of being able to reach us at all times – one might have gone out for a walk or been between places – and so if someone couldn’t reach you (or you didn’t feel like being reached), it was considered normal. Now more people have mobile phones than have toilets. This has created an implicit expectation that you should be able to reach someone when it is convenient for you, regardless of whether it is convenient for them. This expectation is so ingrained that people in meetings routinely answer their mobile phones to say, “I’m sorry, I can’t talk now, I’m in a meeting.” Just a decade or two ago, those same people would have let a landline on their desk go unanswered during a meeting, so different were the expectations for reachability.

Just having the opportunity to multitask is detrimental to cognitive performance. Glenn Wilson, former visiting professor of psychology at Gresham College, London, calls it info-mania. His research found that being in a situation where you are trying to concentrate on a task, and an email is sitting unread in your inbox, can reduce your effective IQ by 10 points. And although people ascribe many benefits to marijuana, including enhanced creativity and reduced pain and stress, it is well documented that its chief ingredient, cannabinol, activates dedicated cannabinol receptors in the brain and interferes profoundly with memory and with our ability to concentrate on several things at once. Wilson showed that the cognitive losses from multitasking are even greater than the cognitive losses from pot‑smoking.

Russ Poldrack, a neuroscientist at Stanford, found that learning information while multitasking causes the new information to go to the wrong part of the brain. If students study and watch TV at the same time, for example, the information from their schoolwork goes into the striatum, a region specialised for storing new procedures and skills, not facts and ideas. Without the distraction of TV, the information goes into the hippocampus, where it is organised and categorised in a variety of ways, making it easier to retrieve. MIT’s Earl Miller adds, “People can’t do [multitasking] very well, and when they say they can, they’re deluding themselves.” And it turns out the brain is very good at this deluding business.

Then there are the metabolic costs that I wrote about earlier. Asking the brain to shift attention from one activity to another causes the prefrontal cortex and striatum to burn up oxygenated glucose, the same fuel they need to stay on task. And the kind of rapid, continual shifting we do with multitasking causes the brain to burn through fuel so quickly that we feel exhausted and disoriented after even a short time. We’ve literally depleted the nutrients in our brain. This leads to compromises in both cognitive and physical performance. Among other things, repeated task switching leads to anxiety, which raises levels of the stress hormone cortisol in the brain, which in turn can lead to aggressive and impulsive behaviour. By contrast, staying on task is controlled by the anterior cingulate and the striatum, and once we engage the central executive mode, staying in that state uses less energy than multitasking and actually reduces the brain’s need for glucose.


To make matters worse, lots of multitasking requires decision-making: Do I answer this text message or ignore it? How do I respond to this? How do I file this email? Do I continue what I’m working on now or take a break? It turns out that decision-making is also very hard on your neural resources and that little decisions appear to take up as much energy as big ones. One of the first things we lose is impulse control. This rapidly spirals into a depleted state in which, after making lots of insignificant decisions, we can end up making truly bad decisions about something important. Why would anyone want to add to their daily weight of information processing by trying to multitask?

In discussing information overload with Fortune 500 leaders, top scientists, writers, students, and small business owners, email comes up again and again as a problem. It’s not a philosophical objection to email itself, it’s the mind-numbing number of emails that come in. When the 10-year-old son of my neuroscience colleague Jeff Mogil (head of the Pain Genetics lab at McGill University) was asked what his father does for a living, he responded, “He answers emails.” Jeff admitted after some thought that it’s not so far from the truth. Workers in government, the arts, and industry report that the sheer volume of email they receive is overwhelming, taking a huge bite out of their day. We feel obliged to answer our emails, but it seems impossible to do so and get anything else done.

Before email, if you wanted to write to someone, you had to invest some effort in it. You’d sit down with pen and paper, or at a typewriter, and carefully compose a message. There wasn’t anything about the medium that lent itself to dashing off quick notes without giving them much thought, partly because of the ritual involved, and the time it took to write a note, find and address an envelope, add postage, and take the letter to a mailbox. Because the very act of writing a note or letter to someone took this many steps, and was spread out over time, we didn’t go to the trouble unless we had something important to say. Because of email’s immediacy, most of us give little thought to typing up any little thing that pops in our heads and hitting the send button. And email doesn’t cost anything.

Sure, there’s the money you paid for your computer and your internet connection, but there is no incremental cost to sending one more email. Compare this with paper letters. Each one incurred the price of the envelope and the postage stamp, and although this doesn’t represent a lot of money, these were in limited supply – if you ran out of them, you’d have to make a special trip to the stationery store and the post office to buy more, so you didn’t use them frivolously. The sheer ease of sending emails has led to a change in manners, a tendency to be less polite about what we ask of others. Many professionals tell a similar story. One said, “A large proportion of emails I receive are from people I barely know asking me to do something for them that is outside what would normally be considered the scope of my work or my relationship with them. Email somehow apparently makes it OK to ask for things they would never ask by phone, in person, or in snail mail.”


There are also important differences between snail mail and email on the receiving end. In the old days, the only mail we got came once a day, which effectively created a cordoned-off section of your day to collect it from the mailbox and sort it. Most importantly, because it took a few days to arrive, there was no expectation that you would act on it immediately. If you were engaged in another activity, you’d simply let the mail sit in the box outside or on your desk until you were ready to deal with it. Now email arrives continuously, and most emails demand some sort of action: Click on this link to see a video of a baby panda, or answer this query from a co-worker, or make plans for lunch with a friend, or delete this email as spam. All this activity gives us a sense that we’re getting things done – and in some cases we are. But we are sacrificing efficiency and deep concentration when we interrupt our priority activities with email.

Until recently, each of the many different modes of communication we used signalled its relevance, importance, and intent. If a loved one communicated with you via a poem or a song, even before the message was apparent, you had a reason to assume something about the nature of the content and its emotional value. If that same loved one communicated instead via a summons, delivered by an officer of the court, you would have expected a different message before even reading the document. Similarly, phone calls were typically used to transact different business from that of telegrams or business letters. The medium was a clue to the message. All of that has changed with email, and this is one of its overlooked disadvantages – because it is used for everything. In the old days, you might sort all of your postal mail into two piles, roughly corresponding to personal letters and bills. If you were a corporate manager with a busy schedule, you might similarly sort your telephone messages for callbacks. But emails are used for all of life’s messages. We compulsively check our email in part because we don’t know whether the next message will be for leisure/amusement, an overdue bill, a “to do”, a query… something you can do now, later, something life-changing, something irrelevant.

This uncertainty wreaks havoc with our rapid perceptual categorisation system, causes stress, and leads to decision overload. Every email requires a decision! Do I respond to it? If so, now or later? How important is it? What will be the social, economic, or job-related consequences if I don’t answer, or if I don’t answer right now?

Now of course email is approaching obsolescence as a communicative medium. Most people under the age of 30 think of email as an outdated mode of communication used only by “old people”. In its place they text, and some still post to Facebook. They attach documents, photos, videos, and links to their text messages and Facebook posts the way people over 30 do with email. Many people under 20 now see Facebook as a medium for the older generation.


For them, texting has become the primary mode of communication. It offers privacy that you don't get with phone calls, and immediacy you don't get with email. Crisis hotlines have begun accepting texts from at-risk youth, and this gives them two big advantages: they can deal with more than one person at a time, and they can pass the conversation on to an expert, if needed, without interrupting it.

But texting suffers from most of the problems of email and then some. Because it is limited in characters, it discourages thoughtful discussion or any level of detail. And the addictive problems are compounded by texting’s hyperimmediacy. Emails take some time to work their way through the internet and they require that you take the step of explicitly opening them. Text messages magically appear on the screen of your phone and demand immediate attention from you. Add to that the social expectation that an unanswered text feels insulting to the sender, and you’ve got a recipe for addiction: you receive a text, and that activates your novelty centres. You respond and feel rewarded for having completed a task (even though that task was entirely unknown to you 15 seconds earlier). Each of those delivers a shot of dopamine as your limbic system cries out “More! More! Give me more!”

In a famous experiment, my McGill colleagues Peter Milner and James Olds, both neuroscientists, placed a small electrode in the brains of rats, in a small structure of the limbic system called the nucleus accumbens. This structure regulates dopamine production and is the region that “lights up” when gamblers win a bet, drug addicts take cocaine, or people have orgasms – Olds and Milner called it the pleasure centre. A lever in the cage allowed the rats to send a small electrical signal directly to their nucleus accumbens. Do you think they liked it? Boy how they did! They liked it so much that they did nothing else. They forgot all about eating and sleeping. Long after they were hungry, they ignored tasty food if they had a chance to press that little chrome bar; they even ignored the opportunity for sex. The rats just pressed the lever over and over again, until they died of starvation and exhaustion. Does that remind you of anything? A 30-year-old man died in Guangzhou (China) after playing video games continuously for three days. Another man died in Daegu (Korea) after playing video games almost continuously for 50 hours, stopped only by his going into cardiac arrest.

Each time we dispatch an email in one way or another, we feel a sense of accomplishment, and our brain gets a dollop of reward hormones telling us we accomplished something. Each time we check a Twitter feed or Facebook update, we encounter something novel and feel more connected socially (in a kind of weird, impersonal cyber way) and get another dollop of reward hormones. But remember, it is the dumb, novelty-seeking portion of the brain driving the limbic system that induces this feeling of pleasure, not the planning, scheduling, higher-level thought centres in the prefrontal cortex. Make no mistake: email-, Facebook- and Twitter-checking constitute a neural addiction.

https://www.theguardian.com/science/201 ... n-overload
#14878943
Technology and cultural incrementalism operate as the driving forces behind social change.

Nissan’s Brain-To-Vehicle Interface Could Make Driving Safer by Scanning Your Brain

The automaker announced at the beginning of January 2018 that it is developing a Brain-to-Vehicle (B2V) interface that, if implemented, would improve a driver's reaction times to make driving safer.

This human driver—semi-autonomous collaboration would see the latter predicting the former's actions — be it turning the steering wheel or applying the brakes — by reading and interpreting their brain signals using electroencephalography (EEG) technology. Upon doing so, the semi-autonomous vehicle would start those actions 0.2 to 0.5 seconds sooner. The automaker calls it "Nissan Intelligent Mobility." When in autonomous mode, the system could also detect driver discomfort and adjust its driving style accordingly, or use augmented reality to alter what the driver sees.
https://futurism.com/nissans-brain-vehi ... ur-brains/


GM Will Launch Robocars Without Steering Wheels Next Year

After more than a century making vehicles for humans to drive, General Motors has ripped the heart out of its latest ride, and is now holding the grisly spectacle up for all the world to see: A car with no steering wheel. And it plans to put a fleet of these newfangled things to work in a taxi-like service, somewhere in the US, next year.

And no, this robo-chariot, a modified all-electric Chevrolet Bolt, doesn't have pedals either. This is GM's truly driverless debut, a car that will have to handle the world on its own. No matter what happens, you, dear human passenger, cannot help it now.

Terrifying? Maybe. But it's also a major step in GM’s aggressive bid to maintain its big dog status as the auto industry evolves away from individual ownership and flesh-and-blood drivers. And it’s just the beginning for the Detroit stalwart. “We’ve put together four generations of autonomous vehicles over the course of 18 months,” says Dan Ammann, GM’s president. “You can safely assume that the fourth generation won’t be the last.”
https://www.wired.com/story/gm-cruise-s ... unch-2019/


2018: The Year Blockchain, AI and IoT Converge
But will it be centralized?


Decentralization, by its very nature, requires that more intelligence shifts to nodes instead of residing in one central server.

We will continue to see the development of semiconductors that are capable of advanced computing in smaller and smaller devices. As devices at the edge become smarter, the smart contracts enabled by blockchain platforms will work better with more advanced data analytics capabilities.

I see a mini-brain in each of our devices, ranging from simplistic ones to ones capable of processing larger datasets and making decisions based on that data.

The open availability of more data and smarter processing at the nodes will enable broader datasets available to more companies and people, instead of proprietary data ownership that currently exists within companies such as Facebook and Google. More importantly, that data will be diverse and representative of the world we live in, instead of being filtered by a few companies that reside in one geography.

While this may not all happen within the next year, we have started an inevitable march towards that future, one that will be even more transformative than the internet was.
https://www.coindesk.com/2018-year-bloc ... -converge/


Doctors will have to be more like actors as A.I. gains momentum:

Zocdoc CEO Oliver Kharraz fully expects machine learning to take over many clinical functions. For doctors, that means emotional intelligence is going to become increasingly important.

"Doctors in the future will come from the same pool as actors," said Kharraz, in an interview on Thursday at the J.P. Morgan Healthcare conference in San Francisco. Zocdoc's web-based software lets consumers book medical appointments online.

Kharraz, who is also a doctor by training, said there's going to be a shift in the type of personalities attracted to medicine as machines start doing things like diagnosing medical conditions by analyzing scans.

To be successful, doctors are going to need empathy and an ability to listen.
https://www.cnbc.com/2018/01/11/zocdoc- ... er-eq.html




Your doctor, your teacher, your authority



Technology = giving you a sense of purpose and community.

Things don't need to be physical they can be digital = You must be plugged in.
#14883619
For more information on the full-spectrum dominance/domestication of the human domain please check out my Laplace's Demon thread, viewtopic.php?t=166559

Please check out these other threads for more information:

Memories In DNA Project- viewtopic.php?f=70&t=172663

First primate clones created in Chinese laboratory- viewtopic.php?f=42&t=172662

Amazon has created a new computing platform that will future-proof your home

Amazon is in a better position than any other company to dominate ambient computing, the concept that everything in your life is computerized and intelligent.

But I think it's time to add one more category to the list: ambient computing, or the concept that there can be a layer of intelligence powering everything in your home from your lights to your thermostat. Many see this as a new phase of computing where our technology works for us automatically. We're in the early days of ambient computing, but there's already a clear front-runner powering its future: Amazon Alexa.

Read more: http://www.businessinsider.com/amazon-a ... ome-2018-1


Cracker Jack box stores everywhere, well known for being packaged with a prize of trivial value inside.


Amazon Officially Eliminates Cashiers...

Do you own a grocery store? You better pay attention. Same for you, merchants. As consumers get more and more used to self-service automated stores you’re going to need to respond. But don’t fire your employees yet – people still enjoy engaging with humans, so maybe you can figure out a balance between technology and human interaction.

Read more: https://www.forbes.com/sites/quickerbet ... d1e13c64f2


Intel’s quantum computing efforts take a major step forward

All of these efforts then, are baby steps along the way to true quantum computing. Intel isn't the only one pursuing this goal -- IBM happens to have a giant 50-qubit quantum computer hanging around at CES -- and the competition among tech giants to own this next generation of computing can only make it come more quickly.

Read more: https://www.engadget.com/2018/01/08/int ... p-forward/


5G phones expected in 2019 thanks to Chinese, Qualcomm pact

The reality of a 5G phone is closer than you think.

Mobile chip giant Qualcomm on Thursday announced a partnership with several of the largest Chinese phone manufacturers, including Lenovo (Motorola's parent), Xiaomi, ZTE, Oppo (OnePlus' owner) and Vivo, to build 5G phones as early as 2019.

Under the "5G Pioneer" initiative, Qualcomm will help create a platform for the companies to build phones running on 5G, the next generation of wireless technology that promises more speed and better coverage. 5G, one of the hottest trends in tech, is considered the foundation for a number of growing segments such as self-driving cars and artificial intelligence.

Read more: https://www.cnet.com/news/5g-phones-com ... oppo-vivo/
#14884761
As a successor to 4G LTE, 5G is expected to provide a reliable, ultrafast wireless connectivity to a growing industry of high-tech inventions such as self-driving cars and artificial intelligence — even “true AI enhanced networked combat,” according to the memo.

Meet the newest recruit of Dubai’s police force: Dubai will now police streets with self-driving robo-cars (with facial-recognition tech)

Earlier in June, Dubai Police inducted its first robotic police officer into its ranks. This month, city officials announced that autonomous police cars will begin patrolling the streets of Dubai by the end of the year to help identify and track suspects. Named the O-R3, the patrol car can navigate on its own using machine-learning algorithms and comes with a built-in aerial drone and facial-recognition technology to follow targets (and surveil areas and people) off-road. While the O-R3 will patrol the city on its own and use biometric software to scan individuals it comes across, the driverless car will still be controlled by the police remotely from behind a computer dashboard.

https://www.gqindia.com/content/meet-ne ... ice-force/



Ford wants to patent a driverless police car that ambushes lawbreakers using artificial intelligence

Imagine a police car that issues tickets without even pulling you over.

What if the same car could use artificial intelligence to find good hiding spots to catch traffic violators and identify drivers by scanning license plates, tapping into surveillance cameras and wirelessly accessing government records?

What if a police officer tapping on your car window asking for your license and registration became a relic of transportation’s past?

The details may sound far-fetched, as if they belong in the science-fiction action flick “Demolition Man” or a new dystopian novel inspired by Aldous Huxley’s “Brave New World,” but these scenarios are grounded in a potential reality. They come from a patent developed by Ford and being reviewed by the U.S. government to create autonomous police cars. Ford’s patent application was published this month.
https://www.washingtonpost.com/news/inn ... 9ce1c51e58


Is there something “smart” about data centers? Is there anything unusual anymore about autonomous vehicle testing, which is underway in a handful of cities and states already, including throughout the Phoenix area? Popular Mechanics writes that “the community will integrate technology and high speed data into its infrastructure.” Technology in its infrastructure! The future has arrived.
https://slate.com/business/2017/11/bill ... gates.html
#14935315
Full-spectrum Dominance of the Human Domain
Information, Domestication, and Planned Obsolescence.


As you play political dungeons and dragons, pretend to be mythological labels, and rearrange old ideas... the JUGGERNAUT rolls forward. The Juggernaut is always in the background, humming along.

Mood management and genetic collateral damage are the technological standard. The puppet masters' strings are now wireless.

5th Gen wireless communications

The big four national carriers are in a race to deploy next-generation 5G wireless networks, which should bring faster speeds, superior responsiveness and better coverage. 5G is seen as the foundational technology for areas like self-driving cars and streaming virtual reality, and it starts with these early deployments.

So far Verizon has mostly talked about its plans to roll out 5G as a broadband replacement, with only a limited mobile 5G service this year. Meanwhile, T-Mobile and Sprint, which have announced plans to merge, are setting things up now for a commercial launch early next year. AT&T has said it will launch 5G in a dozen cities this year.

But consumers won't see any real benefits of these deployments until 2019 when the first 5G-capable smartphones hit the market.

https://www.cnet.com/news/verizon-sees- ... 5g-launch/


WhatsApp leads mobs to murder

In India, false rumors about child kidnappers have gone viral on WhatsApp, prompting fearful mobs to kill two dozen innocent people since April.

The phenomenon is part of a trend of false information flooding social media in recent years, which has incited violence from Brazil to Sri Lanka.

The messages in India have preyed on a universal fear: harm coming to a child. Some of the false messages on WhatsApp described gangs of kidnappers on the prowl. Others included videos showing people driving up and snatching children.

https://www.wraltechwire.com/2018/07/21 ... -in-india/


^Just wait for diminished-reality tech or next-gen Photoshop to reshape perceptions.

Privatized and weaponized genetic tech is on the horizon.

US military agency invests $100m in genetic extinction technologies

Technology could be used to wipe out malaria carrying mosquitos or other pests but UN experts say fears over possible military uses and unintended consequences strengthen case for a ban

Cutting-edge gene editing tools such as Crispr-Cas9 work by using a synthetic ribonucleic acid (RNA) to cut into DNA strands and then insert, alter or remove targeted traits. These might, for example, distort the sex-ratio of mosquitoes to effectively wipe out malarial populations.
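The reason a gene drive can "wipe out" a population from a small release is that it cheats Mendelian inheritance: heterozygotes transmit the drive allele far more often than the fair 50%. A toy deterministic model (my own illustrative sketch under a random-mating assumption, not the actual Crispr-Cas9 mechanics; function name and parameters are hypothetical) shows how fast the allele spreads:

```python
def drive_frequency(p0, d, generations):
    """Deterministic allele-frequency model for a gene drive.
    p0: starting frequency of the drive allele.
    d:  drive efficiency in [0, 1] -- the boost above fair 50%
        transmission in heterozygotes, so they pass the drive
        on with probability (1 + d) / 2.
    Returns the frequency trajectory, one value per generation."""
    p = p0
    history = [p]
    for _ in range(generations):
        # Under random mating: homozygotes always transmit the drive;
        # heterozygotes (frequency 2p(1-p)) transmit it (1+d)/2 of
        # the time, giving the next-generation allele frequency:
        p = p * p + p * (1 - p) * (1 + d)
        history.append(p)
    return history

# From a 1% seeding, a 90%-efficient drive approaches fixation in
# a couple of dozen generations; with d = 0 the model reduces to
# ordinary Mendelian inheritance and the frequency stays put.
traj = drive_frequency(p0=0.01, d=0.9, generations=25)
```

Once the drive is near fixation, a payload that distorts the sex ratio (as in the mosquito example above) collapses the population, which is why the same mathematics raises both the eradication hopes and the dual-use fears quoted here.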

“The dual use nature of altering and eradicating entire populations is as much a threat to peace and food security as it is a threat to ecosystems,” he said. “Militarisation of gene drive funding may even contravene the Enmod convention against hostile uses of environmental modification technologies.”

https://www.theguardian.com/science/201 ... chnologies




DARPA’s latest endeavor is a tiny robotics challenge

With a history that can claim a pioneering role in the development of the internet, Siri, GPS and other world-changing inventions, the U.S. government agency known as DARPA (Defense Advanced Research Projects Agency) has always thought big. Until now, at least. With its new SHRIMP program, DARPA is suddenly thinking very small indeed, and that's really exciting.

The SHRIMP program — short for SHort-Range Independent Microrobotic Platforms — is an effort to develop new insect-scale robots for operating in environments where much larger robots may be less effective. In the tradition of its DARPA Grand Challenges, the organization is seeking proposals for suitable robots, in this case ones that weigh less than a gram and are smaller than one cubic centimeter. The selected micro-bots will then compete against one another in a “series of Olympic-themed competitions,” including categories like rock piling, steeplechase, vertical ascent, shot put, weightlifting and more.

https://www.digitaltrends.com/cool-tech ... -olympics/


DARPA Is Funding Research Into AI That Can Explain What It’s “Thinking”

These third-wave systems would “think” rather than just churn out answers based on whatever datasets they are fed (and, as we have seen in the past, those datasets can include the biases of their creators). Ultimately, this is the next step toward creating AIs that can reason and engage in abstract thought, which could improve how both the military and everyone else make use of AI.

https://futurism.com/third-wave-ai-darpa/



Why Universities Need To Prepare Students For The New AI World

Artificial intelligence is increasingly embedded in our consumer and business lives, and it is poised to transform how societies function in the years to come. Yet universities are not adequately preparing students for this shift. To do so, AI needs to be more deeply embedded in higher education itself.

For students, AI will inevitably shape their careers. Those drawn to the field can pursue a wide range of exciting new possibilities in data science, machine learning or advanced statistics. And even students not focused on AI would benefit from a sound education in it and familiarity with working alongside machines.

The AI era will inevitably create new job types, ranging from machine regulators to emotion engineers. To succeed, all students will need to understand, at least at a high level, how machines perform. In addition, they should better equip themselves to do what machines cannot do.

https://www.forbes.com/sites/stephanieg ... e00b8d6bc8


World-first quantum computer simulation of chemical bonds using trapped ions

Quantum chemistry expected to be one of the first applications of full-scale quantum computers

An international group of researchers has achieved the world's first multi-qubit demonstration of a quantum chemistry calculation performed on a system of trapped ions, one of the leading hardware platforms in the race to develop a universal quantum computer.

https://www.sciencedaily.com/releases/2 ... 110028.htm


MIT Uses Nanotech to Miniaturize Electronics Into Spray Form

The 'aerosolized electronics' are so small they can be sprayed through the air. MIT researchers say the tiny devices could be used in oil and gas pipelines, or even in the human digestive system, to detect problems.

https://www.pcmag.com/news/362652/mit-u ... spray-form



Researchers find quantum 'Maxwell's demon' may give up information to extract work

Murch's team wanted to know if it would be possible to use information to extract work in this way on a quantum scale, too, but not by sorting fast and slow molecules. If a particle is in an excited state, they could extract work by moving it to a ground state. (If it was in a ground state, they wouldn't do anything and wouldn't expend any work.)

But they wanted to know what would happen if the quantum particles were in an excited state and a ground state at the same time, analogous to being fast and slow at the same time. In quantum physics, this is known as a superposition.
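As a point of reference, the superposition described above can be written in standard notation for a generic two-level system (the symbols here are illustrative, not taken from the paper itself):

```latex
% A qubit in a superposition of the ground state |g> and excited state |e>:
|\psi\rangle = \alpha\,|g\rangle + \beta\,|e\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
% A projective ("long, hard") measurement collapses this to |g> or |e>,
% with probabilities |\alpha|^2 and |\beta|^2 respectively.
```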

"Can you get work from information about a superposition of energy states?" Murch asked. "That's what we wanted to find out."

There's a problem, though. On a quantum scale, getting information about particles can be a bit … tricky.

"Every time you measure the system, it changes that system," Murch said. And if they measured the particle to find out exactly what state it was in, it would revert to one of two states: excited, or ground.

This effect is called quantum backaction. To get around it, the researchers (who played the role of the "demons") didn't take a long, hard look at their particle. Instead, they took what is called a "weak observation." It still influenced the state of the superposition, but not enough to collapse it all the way to an excited state or a ground state; the particle remained in a superposition of energy states. This observation was enough, though, to let the researchers track, with fairly high accuracy, exactly what superposition the particle was in. That matters, because how work is extracted from the particle depends on its superposition state.
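To illustrate the idea (this is a generic textbook model, not the group's actual experiment), a weak measurement can be simulated as a noisy detector whose Gaussian readout only partially collapses the state. In the sketch below, which uses made-up parameter names, a large noise width `sigma` means a weak measurement that leaves the superposition largely intact, while a small `sigma` approaches an ordinary projective measurement:

```python
import numpy as np

rng = np.random.default_rng(0)

def weak_measure(alpha, beta, sigma):
    """One weak z-measurement of a qubit alpha|g> + beta|e>.

    Large sigma = weak measurement (little backaction);
    sigma -> 0 = projective measurement (full collapse).
    """
    # The detector reading r lands near -1 (ground) or +1 (excited),
    # blurred by Gaussian noise of width sigma.
    center = rng.choice([-1.0, 1.0], p=[abs(alpha)**2, abs(beta)**2])
    r = rng.normal(center, sigma)
    # Bayesian update: each amplitude is reweighted by the Gaussian
    # likelihood of seeing r given that state, then renormalized.
    a = alpha * np.exp(-(r + 1)**2 / (4 * sigma**2))
    b = beta * np.exp(-(r - 1)**2 / (4 * sigma**2))
    norm = np.hypot(abs(a), abs(b))
    return a / norm, b / norm, r

# Start in an equal superposition and apply one weak measurement.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
a, b, r = weak_measure(alpha, beta, sigma=5.0)
# With sigma=5 the state barely moves: still close to 50/50.
print(abs(a)**2, abs(b)**2)
```

Shrinking `sigma` toward zero reproduces the full collapse (backaction) the article describes, while a single noisy readout with large `sigma` yields only a little information, which is exactly the trade-off the "demon" exploits.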

To get information, even using the weak observation method, the researchers still had to take a peek at the particle, which meant they needed light. So they sent some photons in, and observed the photons that came back.

"But the demon misses some photons," Murch said. "It only gets about half. The other half are lost." But, and this is the key, even though the researchers didn't see the other half of the photons, those photons still interacted with the system, which means they still had an effect on it. The researchers had no way of knowing what that effect was.

They took a weak measurement and got some information, but because of quantum backaction, they might end up knowing less than they did before the measurement. On balance, that's negative information.

And that's weird.

"Do the rules of thermodynamics for a macroscopic, classical world still apply when we talk about quantum superposition?" Murch asked. "We found that yes, they hold, except there's this weird thing. The information can be negative.

"I think this research highlights how difficult it is to build a quantum computer," Murch said.

"For a normal computer, it just gets hot and we need to cool it. In the quantum computer you are always at risk of losing information."


Read more at: https://phys.org/news/2018-07-quantum-m ... n.html#jCp


Virtual Reality Has Reached A “Tipping Point.” It’s Officially Here to Stay.

Thanks to virtual reality, you can swim with the dolphins, play some tennis, or spend some alone time, all from the comfort of your own living room. But it’s not yet perfect — a horrible wave of nausea can hit anytime, right in the middle of these activities.

“VR isn’t where I want it to be, but this current generation of products — I think it’s proved that VR is real,” David Ewalt told Futurism. Ewalt is a writer and journalist who focuses on new technology. His book on virtual reality, “Defying Reality,” was published on July 17.

“It’s not hype anymore,” he added. “It needs to get much better, but I think we reached that tipping point where you can try the products we have now and say, ‘damn, that really works. VR is real.’”

https://futurism.com/virtual-reality-tipping-point/




To be continued...
#14935558
Google’s Selfish Ledger is an unsettling vision of Silicon Valley social engineering

This internal video from 2016 shows a Google concept for how total data collection could reshape society





#14938119
Technophiles rejoice, a new drug is on the way!

Magic Leap Headset Test Drive: Off Your Phone and Into Your World

There he is, the size of a Candy Land piece, right on the ottoman in front of me: teeny, tiny LeBron James. He jets down the Golden State Warriors’ court—sitting flush on the chocolate leather—and dunks in a hoop the size of my wedding band.

No, I haven’t had a psychedelic sandwich for lunch. I’ve just been wearing what looks like a pair of oversize swim goggles, attached to a Discman thingy on my hip—the Magic Leap One Creator Edition.

These augmented-reality goggles put virtual objects in the real world, unlike virtual-reality goggles, which block it out. Think “Pokémon Go” but far more realistic and potentially useful.

If you haven’t been following Silicon Valley’s mounting interest in AR, it’s time. Microsoft’s HoloLens headset is starting to pick up steam in enterprise applications. Apple has big plans in the space. And Magic Leap, while a no-name to most, has received nutty amounts of cash and a lot of buzz in the tech community. Since 2011, the company has raised over $2.3 billion, including funds from Google and AT&T, on the promise of its mysterious “Lightfield” technology.

https://www.wsj.com/articles/magic-leap ... 1533730080


Technology is the opium of the masses.
