"So much joy I cry, so much pain I laugh."
The ink of the scholar is more sacred than the blood of the martyr.
Remember, you need more than one note to make beautiful music.
Love is the missing link!
Wandering the information superhighway, he came upon the last refuge of civilization, PoFo, the only forum on the internet ...
Billionaire ex-Facebook president Sean Parker unloads on Mark Zuckerberg and admits he helped build a monster. I'd like to point out that changing or abandoning social media's advertising models will not stop the beast... The medium is the message.
Sean Parker, the first president of Facebook, has a disturbing warning about the social network: "God only knows what it's doing to our children's brains."
Speaking to the news website Axios, the entrepreneur and executive talked openly about what he perceives as the dangers of social media and how it exploits human "vulnerability."
"The thought process that went into building these applications, Facebook being the first of them ... was all about: 'How do we consume as much of your time and conscious attention as possible?'" said Parker, who joined Facebook in 2004, when it was less than a year old.
"And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever," he told Axios. "And that's going to get you to contribute more content, and that's going to get you ... more likes and comments."
Parker added: "It's a social-validation feedback loop ... exactly the kind of thing that a hacker like myself would come up with, because you're exploiting a vulnerability in human psychology."
"The inventors, creators — it's me, it's Mark [Zuckerberg], it's Kevin Systrom on Instagram, it's all of these people — understood this consciously," he said. "And we did it anyway."
Facebook did not immediately respond to Business Insider's request for comment.
Some in tech are growing disillusioned — and worried
Parker isn't the only tech figure to express disillusionment and worry about what they helped create. Tristan Harris, a former Google employee, has been outspoken in his criticism of how tech companies' products hijack users' minds.
"If you're an app, how do you keep people hooked? Turn yourself into a slot machine," he wrote in a widely shared Medium post in 2016.
"We need our smartphones, notifications screens and web browsers to be exoskeletons for our minds and interpersonal relationships that put our values, not our impulses, first," he continued. "People's time is valuable. And we should protect it with the same rigor as privacy and other digital rights."
In a recent feature, The Guardian spoke to tech workers and industry figures who have been critical of Silicon Valley business practices.
Loren Brichter, the designer who created the slot-machine-like pull-down-to-refresh mechanism now widely used on smartphones, said, "I've spent many hours and weeks and months and years thinking about whether anything I've done has made a net positive impact on society or humanity at all."
Brichter added: "Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I'm not saying I'm mature now, but I'm a little bit more mature, and I regret the downsides."
And Roger McNamee, an investor in Facebook and Google, told The Guardian: "The people who run Facebook and Google are good people, whose well-intentioned strategies have led to horrific unintended consequences ... The problem is that there is nothing the companies can do to address the harm unless they abandon their current advertising models."
The comments from Parker and others are further evidence of souring public sentiment about Silicon Valley. Once lauded in utopian terms, companies like Facebook have now come under heavy criticism over their role in the spread of "fake news" and Russian propaganda.
http://www.businessinsider.com/ex-faceb ... ty-2017-11
Why the modern world is bad for your brain
Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion. Earl Miller, a neuroscientist at MIT and one of the world experts on divided attention, says that our brains are “not wired to multitask well… When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.” So we’re not actually keeping a lot of balls in the air like an expert juggler; we’re more like a bad amateur plate spinner, frantically switching from one task to another, ignoring the one that is not right in front of us but worried it will come crashing down any minute. Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient.
Multitasking has been found to increase the production of the stress hormone cortisol as well as the fight-or-flight hormone adrenaline, which can overstimulate your brain and cause mental fog or scrambled thinking. Multitasking creates a dopamine-addiction feedback loop, effectively rewarding the brain for losing focus and for constantly searching for external stimulation. To make matters worse, the prefrontal cortex has a novelty bias, meaning that its attention can be easily hijacked by something new – the proverbial shiny objects we use to entice infants, puppies, and kittens. The irony here for those of us who are trying to focus amid competing activities is clear: the very brain region we need to rely on for staying on task is easily distracted. We answer the phone, look up something on the internet, check our email, send an SMS, and each of these things tweaks the novelty-seeking, reward-seeking centres of the brain, causing a burst of endogenous opioids (no wonder it feels so good!), all to the detriment of our staying on task. It is the ultimate empty-caloried brain candy. Instead of reaping the big rewards that come from sustained, focused effort, we instead reap empty rewards from completing a thousand little sugar-coated tasks.
In the old days, if the phone rang and we were busy, we either didn’t answer or we turned the ringer off. When all phones were wired to a wall, there was no expectation of being able to reach us at all times – one might have gone out for a walk or been between places – and so if someone couldn’t reach you (or you didn’t feel like being reached), it was considered normal. Now more people have mobile phones than have toilets. This has created an implicit expectation that you should be able to reach someone when it is convenient for you, regardless of whether it is convenient for them. This expectation is so ingrained that people in meetings routinely answer their mobile phones to say, “I’m sorry, I can’t talk now, I’m in a meeting.” Just a decade or two ago, those same people would have let a landline on their desk go unanswered during a meeting, so different were the expectations for reachability.
Just having the opportunity to multitask is detrimental to cognitive performance. Glenn Wilson, former visiting professor of psychology at Gresham College, London, calls it info-mania. His research found that being in a situation where you are trying to concentrate on a task, and an email is sitting unread in your inbox, can reduce your effective IQ by 10 points. And although people ascribe many benefits to marijuana, including enhanced creativity and reduced pain and stress, it is well documented that its chief ingredient, cannabinol, activates dedicated cannabinol receptors in the brain and interferes profoundly with memory and with our ability to concentrate on several things at once. Wilson showed that the cognitive losses from multitasking are even greater than the cognitive losses from pot‑smoking.
Russ Poldrack, a neuroscientist at Stanford, found that learning information while multitasking causes the new information to go to the wrong part of the brain. If students study and watch TV at the same time, for example, the information from their schoolwork goes into the striatum, a region specialised for storing new procedures and skills, not facts and ideas. Without the distraction of TV, the information goes into the hippocampus, where it is organised and categorised in a variety of ways, making it easier to retrieve. MIT’s Earl Miller adds, “People can’t do [multitasking] very well, and when they say they can, they’re deluding themselves.” And it turns out the brain is very good at this deluding business.
Then there are the metabolic costs that I wrote about earlier. Asking the brain to shift attention from one activity to another causes the prefrontal cortex and striatum to burn up oxygenated glucose, the same fuel they need to stay on task. And the kind of rapid, continual shifting we do with multitasking causes the brain to burn through fuel so quickly that we feel exhausted and disoriented after even a short time. We’ve literally depleted the nutrients in our brain. This leads to compromises in both cognitive and physical performance. Among other things, repeated task switching leads to anxiety, which raises levels of the stress hormone cortisol in the brain, which in turn can lead to aggressive and impulsive behaviour. By contrast, staying on task is controlled by the anterior cingulate and the striatum, and once we engage the central executive mode, staying in that state uses less energy than multitasking and actually reduces the brain’s need for glucose.
To make matters worse, lots of multitasking requires decision-making: Do I answer this text message or ignore it? How do I respond to this? How do I file this email? Do I continue what I’m working on now or take a break? It turns out that decision-making is also very hard on your neural resources and that little decisions appear to take up as much energy as big ones. One of the first things we lose is impulse control. This rapidly spirals into a depleted state in which, after making lots of insignificant decisions, we can end up making truly bad decisions about something important. Why would anyone want to add to their daily weight of information processing by trying to multitask?
In discussing information overload with Fortune 500 leaders, top scientists, writers, students, and small business owners, email comes up again and again as a problem. It’s not a philosophical objection to email itself, it’s the mind-numbing number of emails that come in. When the 10-year-old son of my neuroscience colleague Jeff Mogil (head of the Pain Genetics lab at McGill University) was asked what his father does for a living, he responded, “He answers emails.” Jeff admitted after some thought that it’s not so far from the truth. Workers in government, the arts, and industry report that the sheer volume of email they receive is overwhelming, taking a huge bite out of their day. We feel obliged to answer our emails, but it seems impossible to do so and get anything else done.
Before email, if you wanted to write to someone, you had to invest some effort in it. You'd sit down with pen and paper, or at a typewriter, and carefully compose a message. There wasn't anything about the medium that lent itself to dashing off quick notes without giving them much thought, partly because of the ritual involved, and the time it took to write a note, find and address an envelope, add postage, and take the letter to a mailbox. Because the very act of writing a note or letter to someone took this many steps, and was spread out over time, we didn't go to the trouble unless we had something important to say. Because of email's immediacy, most of us give little thought to typing up any little thing that pops into our heads and hitting the send button. And email doesn't cost anything.
Sure, there’s the money you paid for your computer and your internet connection, but there is no incremental cost to sending one more email. Compare this with paper letters. Each one incurred the price of the envelope and the postage stamp, and although this doesn’t represent a lot of money, these were in limited supply – if you ran out of them, you’d have to make a special trip to the stationery store and the post office to buy more, so you didn’t use them frivolously. The sheer ease of sending emails has led to a change in manners, a tendency to be less polite about what we ask of others. Many professionals tell a similar story. One said, “A large proportion of emails I receive are from people I barely know asking me to do something for them that is outside what would normally be considered the scope of my work or my relationship with them. Email somehow apparently makes it OK to ask for things they would never ask by phone, in person, or in snail mail.”
There are also important differences between snail mail and email on the receiving end. In the old days, the only mail we got came once a day, which effectively created a cordoned-off section of your day to collect it from the mailbox and sort it. Most importantly, because it took a few days to arrive, there was no expectation that you would act on it immediately. If you were engaged in another activity, you’d simply let the mail sit in the box outside or on your desk until you were ready to deal with it. Now email arrives continuously, and most emails demand some sort of action: Click on this link to see a video of a baby panda, or answer this query from a co-worker, or make plans for lunch with a friend, or delete this email as spam. All this activity gives us a sense that we’re getting things done – and in some cases we are. But we are sacrificing efficiency and deep concentration when we interrupt our priority activities with email.
Until recently, each of the many different modes of communication we used signalled its relevance, importance, and intent. If a loved one communicated with you via a poem or a song, even before the message was apparent, you had a reason to assume something about the nature of the content and its emotional value. If that same loved one communicated instead via a summons, delivered by an officer of the court, you would have expected a different message before even reading the document. Similarly, phone calls were typically used to transact different business from that of telegrams or business letters. The medium was a clue to the message. All of that has changed with email, and this is one of its overlooked disadvantages – because it is used for everything. In the old days, you might sort all of your postal mail into two piles, roughly corresponding to personal letters and bills. If you were a corporate manager with a busy schedule, you might similarly sort your telephone messages for callbacks. But emails are used for all of life’s messages. We compulsively check our email in part because we don’t know whether the next message will be for leisure/amusement, an overdue bill, a “to do”, a query… something you can do now, later, something life-changing, something irrelevant.
This uncertainty wreaks havoc with our rapid perceptual categorisation system, causes stress, and leads to decision overload. Every email requires a decision! Do I respond to it? If so, now or later? How important is it? What will be the social, economic, or job-related consequences if I don’t answer, or if I don’t answer right now?
Now of course email is approaching obsolescence as a communicative medium. Most people under the age of 30 think of email as an outdated mode of communication used only by “old people”. In its place they text, and some still post to Facebook. They attach documents, photos, videos, and links to their text messages and Facebook posts the way people over 30 do with email. Many people under 20 now see Facebook as a medium for the older generation.
For them, texting has become the primary mode of communication. It offers privacy that you don't get with phone calls, and immediacy you don't get with email. Crisis hotlines have begun accepting contact from at-risk youth via text, which gives them two big advantages: they can deal with more than one person at a time, and they can pass the conversation on to an expert, if needed, without interrupting it.
But texting suffers from most of the problems of email and then some. Because it is limited in characters, it discourages thoughtful discussion or any level of detail. And the addictive problems are compounded by texting’s hyperimmediacy. Emails take some time to work their way through the internet and they require that you take the step of explicitly opening them. Text messages magically appear on the screen of your phone and demand immediate attention from you. Add to that the social expectation that an unanswered text feels insulting to the sender, and you’ve got a recipe for addiction: you receive a text, and that activates your novelty centres. You respond and feel rewarded for having completed a task (even though that task was entirely unknown to you 15 seconds earlier). Each of those delivers a shot of dopamine as your limbic system cries out “More! More! Give me more!”
In a famous experiment, my McGill colleagues Peter Milner and James Olds, both neuroscientists, placed a small electrode in the brains of rats, in a small structure of the limbic system called the nucleus accumbens. This structure regulates dopamine production and is the region that "lights up" when gamblers win a bet, drug addicts take cocaine, or people have orgasms – Olds and Milner called it the pleasure centre. A lever in the cage allowed the rats to send a small electrical signal directly to their nucleus accumbens. Do you think they liked it? Boy, did they ever! They liked it so much that they did nothing else. They forgot all about eating and sleeping. Long after they were hungry, they ignored tasty food if they had a chance to press that little chrome bar; they even ignored the opportunity for sex. The rats just pressed the lever over and over again, until they died of starvation and exhaustion. Does that remind you of anything? A 30-year-old man died in Guangzhou (China) after playing video games continuously for three days. Another man died in Daegu (Korea) after playing video games almost continuously for 50 hours, stopped only by his going into cardiac arrest.
Each time we dispatch an email in one way or another, we feel a sense of accomplishment, and our brain gets a dollop of reward hormones telling us we accomplished something. Each time we check a Twitter feed or Facebook update, we encounter something novel and feel more connected socially (in a kind of weird, impersonal cyber way) and get another dollop of reward hormones. But remember, it is the dumb, novelty-seeking portion of the brain driving the limbic system that induces this feeling of pleasure, not the planning, scheduling, higher-level thought centres in the prefrontal cortex. Make no mistake: email-, Facebook- and Twitter-checking constitute a neural addiction.
https://www.theguardian.com/science/201 ... n-overload
Nissan’s Brain-To-Vehicle Interface Could Make Driving Safer by Scanning Your Brain
The automaker announced at the beginning of January 2018 that it is developing a Brain-to-Vehicle (B2V) interface that, if implemented, would improve a driver's reaction times and make driving safer.
This collaboration between human driver and semi-autonomous vehicle would see the vehicle predicting the driver's actions — be it turning the steering wheel or applying the brakes — by reading and interpreting brain signals using electroencephalography (EEG) technology. Upon doing so, the semi-autonomous vehicle would start those actions 0.2 to 0.5 seconds sooner. The automaker calls it "Nissan Intelligent Mobility." When in autonomous mode, the system could also detect driver discomfort and adjust its driving style accordingly, or use augmented reality to alter what the driver sees.
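Nissan hasn't published implementation details, but the control flow described in the article — read EEG, classify the driver's intended action, start that action early — might be sketched like this. Every name here (`classify_intent`, `Vehicle`, the 0.8 threshold, the 0.3 s lead time) is a hypothetical stand-in, not Nissan's actual API:

```python
ACTION_LEAD_TIME = 0.3  # seconds; Nissan quotes a 0.2-0.5 s head start

class Vehicle:
    """Minimal stand-in for the car's actuation interface."""
    def __init__(self):
        self.actions = []

    def begin(self, action, lead_time):
        # Record the manoeuvre the car starts ahead of the driver.
        self.actions.append((action, lead_time))

def classify_intent(eeg_window):
    """Hypothetical classifier mapping an EEG sample window to a
    predicted driver action, or None if no intent is detected.
    A real system would run a trained model on motor-cortex signals."""
    if max(eeg_window) > 0.8:
        return "brake"
    return None

def b2v_step(eeg_window, vehicle):
    """One tick of the B2V loop: predict intent, pre-start the action."""
    intent = classify_intent(eeg_window)
    if intent is not None:
        # Begin the predicted manoeuvre before the driver physically moves.
        vehicle.begin(intent, lead_time=ACTION_LEAD_TIME)
    return intent
```

The interesting design question is the one the sketch glosses over: the classifier must be confident enough that pre-empting the driver helps rather than fighting them, which is presumably why the claimed head start is only a few tenths of a second.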
https://futurism.com/nissans-brain-vehi ... ur-brains/
GM Will Launch Robocars Without Steering Wheels Next Year
After more than a century making vehicles for humans to drive, General Motors has ripped the heart out of its latest ride, and is now holding the grisly spectacle up for all the world to see: A car with no steering wheel. And it plans to put a fleet of these newfangled things to work in a taxi-like service, somewhere in the US, next year.
And no, this robo-chariot, a modified all-electric Chevrolet Bolt, doesn't have pedals either. This is GM's truly driverless debut, a car that will have to handle the world on its own. No matter what happens, you, dear human passenger, cannot help it now.
Terrifying? Maybe. But it's also a major step in GM’s aggressive bid to maintain its big dog status as the auto industry evolves away from individual ownership and flesh-and-blood drivers. And it’s just the beginning for the Detroit stalwart. “We’ve put together four generations of autonomous vehicles over the course of 18 months,” says Dan Ammann, GM’s president. “You can safely assume that the fourth generation won’t be the last.”
https://www.wired.com/story/gm-cruise-s ... unch-2019/
2018: The Year Blockchain, AI and IoT Converge
But will it be centralized?
Decentralization, by its very nature, requires that more intelligence shifts to nodes instead of residing in one central server.
We will continue to see the development of semiconductors that are capable of advanced computing in smaller and smaller devices. As devices at the edge become smarter, the smart contracts enabled by blockchain platforms will work better with more advanced data analytics capabilities.
I see a mini-brain in each of our devices, ranging from simplistic ones to ones capable of processing larger datasets and making decisions based on that data.
The open availability of more data and smarter processing at the nodes will make broader datasets available to more companies and people, instead of the proprietary data ownership that currently exists within companies such as Facebook and Google. More importantly, that data will be diverse and representative of the world we live in, instead of being filtered by a few companies that reside in one geography.
While this may not all happen within the next year, we have started an inevitable march towards that future, one that will be even more transformative than the internet was.
https://www.coindesk.com/2018-year-bloc ... -converge/
Doctors will have to be more like actors as A.I. gains momentum:
Zocdoc CEO Oliver Kharraz fully expects machine learning to take over many clinical functions. For doctors, that means emotional intelligence is going to become increasingly important.
"Doctors in the future will come from the same pool as actors," said Kharraz, in an interview on Thursday at the J.P. Morgan Healthcare conference in San Francisco. Zocdoc's web-based software lets consumers book medical appointments online.
Kharraz, who is also a doctor by training, said there's going to be a shift in the type of personalities attracted to medicine as machines start doing things like diagnosing medical conditions by analyzing scans.
To be successful, doctors are going to need empathy and an ability to listen.
https://www.cnbc.com/2018/01/11/zocdoc- ... er-eq.html
Amazon has created a new computing platform that will future-proof your home
Amazon is in a better position than any other company to dominate ambient computing, the concept that everything in your life is computerized and intelligent.
But I think it's time to add one more category to the list: ambient computing, or the concept that there can be a layer of intelligence powering everything in your home from your lights to your thermostat. Many see this as a new phase of computing where our technology works for us automatically. We're in the early days of ambient computing, but there's already a clear front-runner powering its future: Amazon Alexa.
Read more: http://www.businessinsider.com/amazon-a ... ome-2018-1
Amazon Officially Eliminates Cashiers...
Do you own a grocery store? You better pay attention. Same for you, merchants. As consumers get more and more used to self-service automated stores you’re going to need to respond. But don’t fire your employees yet – people still enjoy engaging with humans, so maybe you can figure out a balance between technology and human interaction.
Read more: https://www.forbes.com/sites/quickerbet ... d1e13c64f2
Intel’s quantum computing efforts take a major step forward
All of these efforts, then, are baby steps along the way to true quantum computing. Intel isn't the only one pursuing this goal -- IBM happens to have a giant 50-qubit quantum computer hanging around at CES -- and the competition among tech giants to own this next generation of computing can only make it come more quickly.
Read more: https://www.engadget.com/2018/01/08/int ... p-forward/
5G phones expected in 2019 thanks to Chinese, Qualcomm pact
The reality of a 5G phone is closer than you think.
Mobile chip giant Qualcomm on Thursday announced a partnership with several of the largest Chinese phone manufacturers, including Lenovo (Motorola's parent), Xiaomi, ZTE, Oppo (OnePlus' owner) and Vivo, to build 5G phones as early as 2019.
Under the "5G Pioneer" initiative, Qualcomm will help create a platform for the companies to build phones running on 5G, the next generation of wireless technology that promises more speed and better coverage. 5G, one of the hottest trends in tech, is considered the foundation for a number of growing segments such as self-driving cars and artificial intelligence.
Read more: https://www.cnet.com/news/5g-phones-com ... oppo-vivo/
Meet the newest recruit of Dubai’s police force: Dubai will now police streets with self-driving robo-cars (with facial-recognition tech)
Earlier in June, Dubai Police inducted its first robotic police officer into its ranks. This month, city officials announced that autonomous police cars will begin patrolling the streets of Dubai by the end of the year to help identify and track suspects. Named the O-R3, the patrol car can navigate on its own using machine-learning algorithms and comes with a built-in aerial drone and facial-recognition technology to follow targets (and surveil areas and people) off-road. While the O-R3 will patrol the city on its own and use biometric software to scan individuals it comes across, the driverless car will still be controlled remotely by police from behind a computer dashboard.
https://www.gqindia.com/content/meet-ne ... ice-force/
Ford wants to patent a driverless police car that ambushes lawbreakers using artificial intelligence
Imagine a police car that issues tickets without even pulling you over.
What if the same car could use artificial intelligence to find good hiding spots to catch traffic violators and identify drivers by scanning license plates, tapping into surveillance cameras and wirelessly accessing government records?
What if a police officer tapping on your car window asking for your license and registration became a relic of transportation’s past?
The details may sound far-fetched, as if they belong in the science-fiction action flick “Demolition Man” or a new dystopian novel inspired by Aldous Huxley’s “Brave New World,” but these scenarios are grounded in a potential reality. They come from a patent developed by Ford and being reviewed by the U.S. government to create autonomous police cars. Ford’s patent application was published this month.
https://www.washingtonpost.com/news/inn ... 9ce1c51e58