paserbyp: (Default)
A new study shows that over one-third of tech professionals say they struggle with depression. Specifically, 38.8 percent of tech pros responding to a Blind survey say they’re depressed. Broken down by employer, the highest rates belong to Amazon and Microsoft, where 43.4 percent and 41.58 percent of employees, respectively, say they’re depressed. Intel rounds out the top three, with 38.86 percent of its respondents reporting issues with depression.

I’ll point out that the top three companies may not be entirely to blame for their employees’ depression. All three have large footprints in the Pacific Northwest, where the shorter daylight of fall and winter contributes to seasonal affective disorder, or SAD. This narrow window of daylight, along with routinely overcast or rainy conditions, can throw off the body’s circadian rhythm. Seattle psychiatrist David Avery tells The Seattle Times that less daylight can also affect the brain’s hypothalamus, which directs the body’s release of hormones such as melatonin and cortisol. (It’s worth noting that Blind didn’t publish any geographical data about the respondents to its depression survey.)

There are similarities between this depression survey and other Blind studies: over one-third of tech pros report being depressed, over half say their workplace is unhealthy, and nearly 60 percent report burnout. An anonymous Dice survey likewise shows that most tech pros are dissatisfied enough with their jobs to consider seeking employment elsewhere. In other words, there’s a strong sense of dissatisfaction among tech pros across the industry.

At least when it comes to users, tech companies seem to realize their products affect mental health. At WWDC 2018, Apple introduced App Limits, a way to cap how much time you spend on your phone and on particular apps; it seems aimed especially at social media, which multiple studies have linked to depression.

The upside to this survey is that most tech pros aren’t reporting depression. While that’s wonderful, we can’t overlook the nearly 40 percent who are. If you feel similarly, please reach out to a mental health professional for guidance on dealing with depression the right way.

More details: http://blog.teamblind.com/index.php/2018/12/03/39-percent-of-tech-workers-are-depressed
paserbyp: (Default)
Solar cells are a good way to harvest alternative energy for electricity, but most conventional cells still have trouble converting solar energy when the skies are overcast. Researchers have, of course, been trying to overcome this problem. In a recent effort at the University of British Columbia (UBC), for example, a team used bacteria to help cells generate electricity even when the sun isn’t shining. UBC’s Department of Chemical and Biological Engineering developed the cell, called a “biogenic” cell because it’s made from natural materials.

Solar cells are the part of the solar panel that converts sunlight into an electrical current. Researchers have been experimenting with various materials to develop cells that are more eco-friendly and can work in places known for a lack of sunlight—something with which researchers in British Columbia are familiar.

“Our solution to a uniquely B.C. problem is a significant step toward making solar energy more economical,” said Vikramaditya Yadav, a professor in UBC’s department of chemical and biological engineering, who led the project. Previous efforts to build biogenic solar cells have focused on extracting the natural dye that bacteria use for photosynthesis, Yadav noted. However, this process is costly and complex, uses toxic solvents, and can cause the dye to degrade. Yadav and his team instead devised a way to leave the dye in the bacteria.

Specifically, they genetically engineered E. coli to produce large amounts of lycopene, the pigment that gives tomatoes and watermelons their red color. Lycopene has proven effective at harvesting light for conversion to energy, the researchers said.

The team coated the bacteria with a mineral that could act as a semiconductor and applied the mixture to a glass surface. In this system, the coated glass acts as an anode at one end of the cell, generating a current density of 0.686 milliamps per square centimeter. That is an improvement on comparable cells developed to date, which achieved 0.362 milliamps per square centimeter, Yadav said in the UBC release.
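For a quick sense of scale, here is the arithmetic behind that comparison, as a minimal Python sketch (the two figures are the ones reported in the UBC release):

```python
# Current densities reported for biogenic solar cells, in mA/cm^2.
biogenic_cell = 0.686   # UBC's lycopene-based E. coli cell
previous_best = 0.362   # best comparable biogenic cell to date

improvement = (biogenic_cell - previous_best) / previous_best * 100
print(f"Improvement over previous best: {improvement:.0f}%")  # ~90%
```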

“We recorded the highest current density for a biogenic solar cell,” he said. “These hybrid materials that we are developing can be manufactured economically and sustainably and, with sufficient optimization, could perform at comparable efficiencies as conventional solar cells.”

Moreover, the system worked as efficiently in dim light as in bright light, researchers said. The team published a paper about the work in the journal Small. The researchers see their innovation as a step toward broader adoption of solar panels in places where overcast skies are more common than sunny ones, such as British Columbia and northern Europe.

There are both advantages and challenges to the team’s process, Yadav acknowledged. One advantage is cost: producing the dye this way costs about one-tenth of what it otherwise would. One challenge is finding a process that doesn’t kill the bacteria, Yadav said; if that could be done, the dye could be produced indefinitely.

In addition to solar panels, the process the team devised could also be used to develop biogenic materials for mining, deep-sea exploration, and other low-light environments.

More details: https://news.ubc.ca/2018/07/05/bacteria-powered-solar-cell-converts-light-to-energy-even-under-overcast-skies/

and

https://onlinelibrary.wiley.com/doi/full/10.1002/smll.201800729

Sharks

Aug. 7th, 2018 01:41 pm
paserbyp: (Default)
When scientists want to study birds, they have an enormous crowdsourced data set that they can use. When they want to study mammals on land, they can tag them and track them. But what about when scientists want to study marine animals like sharks?

That's a little more complicated. The world's oceans are much bigger than its land mass. Human scientists are land creatures, too, so most of their in-person observations are limited to the surface of the water, which is just a fraction of the ocean's actual volume; the ocean averages 2.3 miles in depth. Plus, the salt water of the oceans is largely incompatible with the electronic equipment used to track animals' motion and behavior. All of this makes studying sea creatures much more difficult than studying other animals, and it leaves much of sea life a mystery to us.

Maybe that's why the most prevalent characterizations of sharks come from fiction -- the man-eating monster of Jaws and all its sequels, for instance. Yet scientists have spent decades trying to set the record straight on shark behavior by using actual data instead of the imaginations of Hollywood writers. One of those scientists is Salvador Jorgensen, a senior research scientist with the Monterey Bay Aquarium's Project White Shark. Using a number of innovative "tags" packaged to withstand the ocean's salt water, the project collects data about shark movement and behavior, then uses analytics and machine learning to build a more complete picture of what sharks do. These tags are actually collections of sensors -- among them gyroscopes, accelerometers, thermometers, and most recently a camera -- specially packaged for marine environments and designed to be attached to, or ingested by, a shark.

These devices also include a beacon. After about a week in the field, which is about how long the battery lasts, they are collected and brought back to the lab for data transfer and analysis, Jorgensen told InformationWeek in an interview. Jorgensen likens these data loggers to Fitbits for sharks. They can track a shark's movements over the course of several days. Some are equipped with cameras and attached to fins to capture a shark's-eye view of the ocean. And some are packed in blubber and fed to sharks. Jorgensen pioneered this ingestion technique, which lets a device monitor how often the shark feeds by recognizing the temperature changes that happen only when ocean water enters the stomach along with the next meal.
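The article doesn't spell out the detection logic, but the signal Jorgensen describes, a burst of cold seawater entering a warm stomach with each meal, suggests something like the following sketch, which flags feeding events as sharp drops in the ingested logger's temperature series (the thresholds and sampling interval here are hypothetical, not the project's actual values):

```python
import numpy as np

def detect_feeding_events(temps_c, interval_s=60, drop_c=2.0, window=5):
    """Flag likely feeding events in a stomach-temperature series.

    A white shark's stomach stays near body temperature; swallowed
    seawater causes a brief, sharp drop. We flag samples that fall more
    than `drop_c` degrees C below the recent baseline. Successive flags
    from a single meal can be merged downstream.
    """
    temps = np.asarray(temps_c, dtype=float)
    events = []
    for i in range(window, len(temps)):
        baseline = temps[i - window:i].mean()  # rolling recent baseline
        if baseline - temps[i] > drop_c:
            events.append(i * interval_s)      # seconds since deployment
    return events

# Example: a stable ~26 C stomach with one cold-water influx
series = [26.0] * 200
series[100:105] = [22.5, 23.0, 23.8, 24.5, 25.2]
print(detect_feeding_events(series))  # flags the drop around t = 6000 s
```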

"We feed these devices to the shark wrapped in a blubber burrito," Jorgensen said. Like owls, sharks eventually spit out the parts of their meal that are undigestible. After about a week, Jorgensen and his team start to look for the beacons and collect the devices. Jorgensen said that the beacons are equipped with a wet/dry switch that is activated once the device floats up to the surface, breaking the circuit.

Some of the devices do get lost. Some wash up on beaches many miles away, where a phone number and cash reward printed on the packaging prompt whoever finds them to contact the lab and send them back. Jorgensen said people are usually pretty excited to hear that the object they found was either attached to or ingested by a shark. Once a device arrives back at the lab, research scientist Jerry Moxley and deep learning engineer Zac Liu download its data over a WiFi connection. The data arrives as a CSV file of 15 to 25 columns. Moxley told InformationWeek in an interview that it consists of multichannel sensor data, which may include depths, accelerometry, video fields, and more. All of it lands on a local server at the aquarium and is catalogued with many more datasets, organized by deployment, according to Moxley. Currently the team has hundreds of gigabytes of data stored, but less than a terabyte.
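As a rough illustration of that cataloguing step, here is a minimal pandas sketch that gathers per-deployment CSV files into one frame. The directory layout and column names are invented for illustration; the article doesn't specify them:

```python
from pathlib import Path

import pandas as pd

def load_deployments(data_dir):
    """Read every deployment CSV under data_dir into one tidy frame.

    Real tag files carry 15 to 25 multichannel sensor columns; the
    names used here (timestamp, depth_m) are hypothetical.
    """
    frames = []
    for csv_path in sorted(Path(data_dir).glob("*.csv")):
        df = pd.read_csv(csv_path, parse_dates=["timestamp"])
        df["deployment"] = csv_path.stem  # tag rows with their deployment
        frames.append(df)
    return pd.concat(frames, ignore_index=True)

all_data = load_deployments("shark_tags/")
print(all_data.groupby("deployment")["depth_m"].describe())
```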

Once it's downloaded, Moxley works on the data using R, while Liu uses Python with the Keras machine-learning package running on the TensorFlow deep-learning library. For compute, the team leans on CyVerse, a supercomputing initiative funded by the National Science Foundation.

The team is using these technologies to extract more insight from the data they already have. To save battery life, the sensors in the field don't capture high-resolution data all the time; they may collect a high level of detail for six hours, then drop back to a lower resolution. That's one reason Jorgensen believes innovations that extend battery life will have the greatest impact on shark research. In the meantime, the team uses machine learning to fill in the blanks: models trained on the high-resolution segments infer details, such as how often the shark is beating its tail, during periods when only low-resolution data was collected. That gives scientists a better picture of shark energetics, activity, and calorie budgets, Moxley said.
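The article doesn't describe the model itself, but the approach it sketches, training on high-resolution segments so the model can estimate a quantity like tail-beat frequency from coarse data, might look roughly like this Keras sketch. The features, shapes, and architecture are assumptions, and the random arrays stand in for real labeled windows:

```python
import numpy as np
from tensorflow import keras

# Hypothetical training set: each row summarizes a low-resolution window
# (mean depth, coarse acceleration statistics, ...); the target is the
# tail-beat frequency measured in a matching high-resolution segment.
X_train = np.random.rand(1000, 8).astype("float32")  # 8 low-res features
y_train = np.random.rand(1000, 1).astype("float32")  # tail beats per second

model = keras.Sequential([
    keras.layers.Input(shape=(8,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),  # predicted tail-beat frequency
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=10, batch_size=32, verbose=0)

# Fill in a gap: estimate activity for a window logged only at low resolution
low_res_window = np.random.rand(1, 8).astype("float32")
print(model.predict(low_res_window))
```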

What have the scientists learned about shark behavior?

"One of the surprising things we've learned about white sharks is that they don't eat that frequently," said Jorgensen. "We knew from early studies they could go as long as a month. A strategy for white sharks is to have a large meal and then go long periods without foraging. They are not going after everything they see. They are bypassing potential prey all the time."
paserbyp: (Default)


In Science on March 8th, Soroush Vosoughi and his colleagues at the Massachusetts Institute of Technology present evidence that, on Twitter at least, false stories travel faster and farther than true ones. The study, carried out at MIT’s Laboratory for Social Machines, examined tweets sent between 2006 and 2017. The researchers classified tweets as false or true using statistical models built on the verdicts of six independent fact-checking organisations. That allowed them to categorise over 4.5m tweets about 126,000 different stories, which were then ranked according to how they spread among Twitter’s users. The results were stark. False information was retweeted by more people than the true stuff, and faster to boot. True stories took, on average, six times longer than falsehoods to reach at least 1,500 people. Only about 0.1% of true stories were shared by more than 1,000 people, but 1% of false stories managed between 1,000 and 100,000 shares.

The reason false information does better than the true stuff is simple, say the researchers. Things spread through social networks because they are appealing, not because they are true. One way to make news appealing is to make it novel. Sure enough, when the researchers checked how novel a tweet was (by comparing it, statistically, with other tweets) they found false tweets were significantly more novel than the true ones. Untrue stories were also more likely to inspire emotions such as fear, disgust and surprise, whereas genuine ones provoked anticipation, sadness, joy and trust, leading to the rather depressing conclusion that people prefer to share stories that generate strong negative reactions. Perhaps not coincidentally, fake political news was the most likely to go viral. The paper also sheds some of the first peer-reviewed light on the impact of “bots”—automated accounts posing as real people. The idea that Russian bots in particular helped sway America’s presidential election has lodged itself firmly in the public consciousness. Yet the paper finds that, on Twitter at least, the presence of bots does not seem to boost the spread of falsehoods relative to truth.
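The paper measured a tweet's novelty by comparing it with the content its recipients had recently seen on Twitter. As a much simpler stand-in for that comparison (not the paper's actual pipeline), one can score novelty as the TF-IDF cosine distance between a new tweet and a user's recently seen tweets:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def novelty_score(new_tweet, recent_tweets):
    """Score how novel a tweet is relative to recently seen tweets.

    1.0 means unlike anything recent; 0.0 means identical to something
    recent. A crude stand-in for the paper's statistical comparison.
    """
    vec = TfidfVectorizer().fit(recent_tweets + [new_tweet])
    recent_matrix = vec.transform(recent_tweets)
    new_vector = vec.transform([new_tweet])
    return 1.0 - cosine_similarity(new_vector, recent_matrix).max()

seen = ["the game last night was great", "traffic downtown is terrible again"]
print(novelty_score("scientists discover levitating frogs", seen))  # high
print(novelty_score("last night's game was really great", seen))    # lower
```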
paserbyp: (Default)
Business PCs went mainstream in the 1990s. At the beginning of the decade, most people didn’t use PCs in offices; by 2000, pretty much all office work involved them. The use of mice and keyboards, and the necessity of sitting at a PC all day, caused an epidemic of repetitive stress injuries, including carpal tunnel syndrome. It seemed as if everybody got injured by their PCs at some point. It was common back then to see people wearing wrist braces. Companies invested in wrist pads, ergonomic mice and keyboards, and special foot rests. Insurance claims for medical treatment of carpal tunnel exploded.

Then the 2000s hit. Mobile devices took off, and business technology use diversified into laptops, BlackBerry pagers, PDAs and cellphones. We stopped hearing about carpal tunnel and started hearing about “texting thumb” and other repetitive stress injuries related to typing on a phone or pager.

Around ten years ago, technology’s health problems shifted from the physical to the mental. Employees started suffering from all kinds of psychological syndromes, from nomophobia (fear of being without a phone) to phantom vibration syndrome (thinking you feel your phone vibrating even though it isn’t there) to screen insomnia to smartphone addiction. In recent years, our smartphones have begun harming health by giving us social media all day and all night, with notifications and alerts telling us something is happening. Millions of people are now suffering from smartphone addiction, which is really social media addiction, and, as I detailed in this space, it’s harming productivity, health and happiness.

And now management science has identified a collection of problems caused by the accumulated effect of all our technology, called “technostress.”

Technostress is actually not the latest malady in a series of technology-induced syndromes. In fact, it’s an umbrella term that encompasses all negative psychological effects that result from changes in technology.

Nomophobia, phantom vibration syndrome, screen insomnia, smartphone addiction, information overload, Facebook fatigue, selfitis (the compulsive need to post selfies), social media distraction and the rest are all covered by the umbrella of “technostress.”

While ergonomics covers the physical effects of technology, technostress covers the mental effects.

Increasingly, technostress is tied to compulsion. People now feel powerful anxiety when they’re not looking at their phones, fearing unseen important emails and work messages, along with a general sense of FOMO (fear of missing out) on the social networks.

While connected, people compulsively check every incoming communications stream and feel compelled to respond. Time seems to stop, and the work hours spent on compulsive messaging and social media are usually perceived as taking far less time than they actually do.

By the end of the workday, employees are exhausted, feeling that they worked hard all day. But much of that fatigue is caused by the constant mental shifting from one communications medium to the next, and the anxiety and stress are caused by nonstop communication.

A survey of 20,000 European workers conducted by Microsoft and published this week found that technology causes stress, which lowers job satisfaction, organizational commitment and productivity.

Specifically, the survey found, the volume and relentlessness of email, text messages and social media posts distract and distress.

Microsoft makes the very good point that IT leaders readily accept the competitive necessity of digital disruption, as well as the need to do it right. But it also points out that doing it right means not only implementing new ways to work, but also helping employees cope with the stress of digital disruption.

In the past, employees were able to focus on work while at work and personal lives while not at work. Today, smartphones and communication and social apps keep a constant stream of work and personal messages coming in 24 hours a day, and it’s taking a toll.

Smartphone notifications interrupt, and those red circles with the numbers in them showing waiting messages draw people into those apps to check the messages.

Just a tiny fraction of those surveyed by Microsoft — only 11.4% — said they felt highly productive.

Technology, and the way it’s deployed, is not having the intended effect. It’s causing technostress, and lowering, rather than raising, productivity.

The main solution is a strong digital culture within an enterprise, according to Microsoft.

Among surveyed workers at companies with a strong digital culture, 22% said they felt highly productive, roughly double the overall average.

Here are examples of good digital culture practices:

* Put limits on email; no sending or replying to email after work hours.

* Measure employee happiness with technology with surveys of your own, and take action on the results.

* Focus on constructing the workday to enable flow, or concentrated deep work.

* Consider banning phones from meetings.

* Train employees on the causes and cures for technostress, including the management of social media usage.

* Encourage staff to take breaks, avoid work after hours and communicate more in person, rather than digitally.

Most importantly, take this seriously. It’s the kind of thing managers, especially in IT, tend to dismiss. (Microsoft’s survey points out that the most technical people are the least likely to suffer from technostress, and may therefore believe it’s not a big problem).

Technostress sounds like a fad disorder, a frothy buzzword without import. In fact, it’s probably the most costly problem in your organization.

Technostress is caused by changes in technology, and the pace of change will keep accelerating. Artificial intelligence, data analytics, robotics, the internet of things, and virtual and augmented reality will take technostress to a whole new level.
paserbyp: (Default)


New data suggests that tech skills such as network analysis, computer vision, Chef.io, and neural networks are worth anywhere from $140 to $200 per hour on the open market. What other skills earned over $100 per hour? Firmware engineering and hardware prototyping hit $130 per hour, while cloud computing averaged $125. Spatial analysis and “Apple Watch” (presumably building watchOS apps) pulled down $110 per hour, as did NetSuite development. Algorithm development and software debugging were worth a cool $100 per hour.

Obviously, not all freelancers (and gigs) are created equal, and there’s no guarantee that someone with these skills will earn these amounts on the open market. That being said, there are some easily discernible trends behind these freelancer payouts; for example, the high rates paid to those specializing in computer vision suggest there’s a serious market for machine learning and artificial intelligence (A.I.), of which computer vision is a pretty significant building block.

In similar fashion, interest in spatial analysis suggests companies are exploring things such as mapping spaces—potentially vital for everything from self-driving cars to commerce. But for those who don’t specialize in a cutting-edge skill, the good news is that more “standard issue” skills such as debugging and algorithm building can still earn tech pros quite a bit of cash.

While tech freelancing is potentially lucrative, it’s also hard work. Freelancers need to sell themselves, and focus on building up a stable roster of clients who offer repeat business. It’s not for everyone, especially those who dislike the prospect of unsteady income and (occasionally) annoying clients. But for anyone with the right skills and attitude, it can more than pay the bills.
paserbyp: (Default)
The Peter Principle was put forward by Laurence Peter, a Canadian education specialist and, as he called himself, "hierarchologist," together with the Canadian writer Raymond Hull in their immortal work "The Peter Principle," published in 1969. The principle states: "In a hierarchy, every employee tends to rise to his level of incompetence." That is, any worker keeps being promoted until he occupies a position whose duties he is unable to perform. It is assumed that he will remain in that position for as long as his strength and health allow.

In February 2010, the Italian researchers Alessandro Pluchino, Andrea Rapisarda, and Cesare Garofalo of the University of Catania published a paper in the journal "Physica A," "The Peter Principle Revisited: A Computational Study," in which they took the Peter Principle as a premise and added the hypothesis that an organization operating according to it steadily loses efficiency. They then tested several mathematical models of promotion strategies and concluded that the most effective way to improve an organization's efficiency may be to randomize appointments and promotions entirely.
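Below is a toy simulation in that spirit, comparing "promote the best" with random promotion under the Peter hypothesis, in which competence in the new role is unrelated to competence in the old one. This is a minimal sketch of my own, not the authors' actual model (theirs simulated a full pyramidal organization with aging and several competence-transmission hypotheses), and all parameters are illustrative:

```python
import random

def simulate(strategy, levels=4, staff_per_level=20, steps=2000, seed=1):
    """Toy Peter-Principle model: employees have competence in [0, 10].

    Peter hypothesis: competence at a new level is drawn afresh, so
    skill in the old job says nothing about the new one. Efficiency is
    the mean competence per level, weighted so higher levels count more.
    """
    rng = random.Random(seed)
    org = [[rng.uniform(0, 10) for _ in range(staff_per_level)]
           for _ in range(levels)]
    for _ in range(steps):
        lvl = rng.randrange(levels - 1)  # a seat turns over at lvl + 1
        pool = org[lvl]
        if strategy == "best":           # classic "promote the best"
            idx = max(range(len(pool)), key=pool.__getitem__)
        else:                            # random promotion
            idx = rng.randrange(len(pool))
        # The promotee's competence is re-drawn at the new level, and a
        # fresh hire backfills the seat they vacated.
        org[lvl + 1][rng.randrange(staff_per_level)] = rng.uniform(0, 10)
        pool[idx] = rng.uniform(0, 10)
    weights = range(1, levels + 1)
    return sum(w * sum(level) / len(level)
               for w, level in zip(weights, org)) / sum(weights)

for strategy in ("best", "random"):
    print(strategy, round(simulate(strategy), 2))  # random scores higher
```

Because promoting the best performer keeps stripping the lower levels of their strongest people while, under the Peter hypothesis, buying nothing at the level above, the random strategy ends up with the higher weighted efficiency.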
paserbyp: (Default)
What is going on at Sun Microsystems Laboratories in Burlington, Mass.?

Vice President and Sun Fellow Robert Sproull says the Labs employs 150 people and gets 2% of Sun's roughly $2 billion in annual R&D money. Of the projects underway, 60% to 70% are software development efforts, he estimates.

One new technology the company demonstrated was an audioconferencing tool, built in Java, that has some interesting features. For example, users can start a private voice-chat session in the background and adjust the audio level of any individual participant. Users can also migrate a conference call to a cell phone if they have to hit the road. All sensible advances.

Principal Investigator Nicole Yankelovich says she hopes some pieces of the project end up in production, but she won't venture a guess as to where and how. That's life as a research scientist.
