In 2004, in room H33 of Harvard University's Kirkland House, a 19-year-old launched "thefacebook.com." Today that 19-year-old is 32 years old and sits at the helm of a $340 billion social networking empire. Facebook's ability to give every private citizen a public identity revolutionized the way we share our lives and raise awareness about our ideas. It created a breeding ground for entrepreneurs like Evan Spiegel, the Snapchat founder and CEO, who is worth upward of $2.1 billion at the young age of 26. From Zuckerberg we learned that goals deemed unachievable — like bringing back the nostalgia of Pokémon in an app that shattered the social sphere — thrive within the tech ecosystem.

Today the Apple App Store, home to many of these popular platforms, grows by more than 1,000 apps every day, according to the International Business Times. So what's the by-product of all this noise? A community of new age entrepreneurs who are rapidly swiping left or right and filtering photos to grow their businesses. But before they began hashtagging, many of these self-made social media moguls were just trying to learn the tools of the trade from leaders like Zuckerberg, who turned Silicon Valley and everything that came out of it on its head. Here are three relevant tips he gives to all new age entrepreneurs looking to disturb this racket even further:

1. Explore before you commit

In a 2012 talk at Y Combinator Startup School, Zuckerberg stressed to Y Combinator founder Paul Graham that entrepreneurs need to give themselves more flexibility in each pursuit. "You can definitely do that in the framework of a company, but you have to be wary of working at a company and getting locked in," he said. In his first letter to shareholders, Zuckerberg explained that Facebook was never meant to be a company and that at first it was just a hobby of his. Over time it grew into a business.

If you want to be an entrepreneur, it's fine to have ideas that will resolve small problems, but Zuckerberg believes that if you have an idea, it should be paired with big social impact. He told the group of young guns that in order to have big impact, "you're going to change what you do," encouraging them to explore and determine what they don't enjoy so that they can commit to what they do enjoy when they find it.

Moral: Open yourself up to learning new things, and follow only what you love.

2. Don't try to be superhuman

Mistakes are a good, very necessary part of being an entrepreneur. During a live Q&A, a shy eighth-grader once asked Zuckerberg how he overcame challenges, such as finding lead investors and creating hype among users, during the early days of Facebook. "No person knows how to deal with everything. But if you can find a team of people, or friends, or family … then that's what's really going to get you through," he answered.

Many would-be entrepreneurs are afraid to take risks and make mistakes, but Zuckerberg believes that having a strong support system and a team that shares your vision leaves room to appreciate mistakes for what they're worth. "You don't have to be superhuman; you have to just kind of keep on going."

Moral: It's a long journey, but you don't have to go it alone. Find people who share your passion.

3. Done is better than perfect

In Menlo Park, California, where Facebook HQ resides, the phrase "Done is better than perfect" is painted on the inner walls as a social mantra for the company. In other words, producing, gauging a reaction and improving should be considered a badge of honor.
When Facebook filed its S-1 for its IPO at the beginning of 2012, Zuckerberg explained the philosophy behind this mantra as "The Hacker Way." He said, "Instead of debating for days whether a new idea is possible or what the best way to build something is, hackers would rather just prototype something and see what works." By doing so, Facebook embraces an optimistic culture that allows it to test boundaries and say, "This can be better."

Moral: Rather than trying to get everything perfect all at once, seek to continually improve and test boundaries.
Was The Department Of Defense Behind Facebook’s Controversial Manipulation Study?

Submitted by Tyler Durden on 07/01/2014 23:05 -0400

Submitted by Michael Krieger of Liberty Blitzkrieg blog,

I’ve spent pretty much all day reading as much as possible about the extremely controversial Facebook “emotional contagion” study, in which the company intentionally altered its news feed algorithm to see if it could manipulate its users’ emotions. In case you weren’t aware, Facebook is always altering your news feed under the assumption that there’s no way it could fill your feed with all of your “friends’” pointless, self-absorbed, dull updates (there’s just too much garbage). As such, Facebook filters your news feed all the time, something which advertisers must find particularly convenient.

In any event, the particular alteration in question occurred during one week in January 2012, when the company filled some people’s feeds with positive posts, while others were fed more negative posts. Once the data was compiled, academics from the University of California, San Francisco and Cornell University were brought in to analyze the results. Their findings were then published in the prestigious Proceedings of the National Academy of Sciences. They found that:

For people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.

You probably know most of this already, but here is where it starts to get really strange. Initially, the press release from Cornell highlighting the study said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.” Once people started asking questions about this, Cornell claimed it had made a mistake, and that there was no outside funding. Jay Rosen, a journalism professor at NYU, seems to find this highly questionable. He wrote on his Facebook page:

Strange little turn in the story of the Facebook “emotional contagion” study. Last month’s press release from Cornell highlighting the study had said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.” Why would the military be interested? I wanted to know. So I asked Adam D.I. Kramer, the Facebook researcher, that question on his Facebook page, where he has posted what he called a public explanation. (He didn’t reply to my or anyone else’s questions.) See: https://www.facebook.com/akramer/posts/10152987150867796

Now it turns out Cornell was wrong! Or it says it was wrong. The press release now reads: “Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.” Why do I call this strange? Any time my work has been featured in an NYU press release, the PR officers involved show me drafts and coordinate closely with me, for the simple reason that they don’t want to mischaracterize scholarly work.
So now we have to believe that Cornell’s Professor of Communication and Information Science, Jeffrey Hancock, wasn’t shown or didn’t read the press release in which he is quoted about the study’s results (weird), or he did read it but somehow failed to notice that it said his study was funded by the Army when it actually wasn’t (weirder). I think I would notice if my university was falsely telling the world that my research was partially funded by the Pentagon… but, hey, maybe there’s an innocent and boring explanation that I am overlooking.

It gets even more interesting from here. The Professor of Communication and Information Science, Jeffrey Hancock, whom Mr. Rosen mentions above, has a history of working with the U.S. military, specifically the Minerva Institute. In case you forgot what this is, the Guardian reported on it earlier this year. It explained:

A US Department of Defense (DoD) research program is funding universities to model the dynamics, risks and tipping points for large-scale civil unrest across the world, under the supervision of various US military agencies. The multi-million dollar program is designed to develop immediate and long-term “warfighter-relevant insights” for senior officials and decision makers in “the defense policy community,” and to inform policy implemented by “combatant commands.”

Launched in 2008 – the year of the global banking crisis – the DoD ‘Minerva Research Initiative’ partners with universities “to improve DoD’s basic understanding of the social, cultural, behavioral, and political forces that shape regions of the world of strategic importance to the US.”

SCG News has written one of the best articles I have seen yet on the links between the Facebook study and the Department of Defense. It notes:

In the official credits for the study conducted by Facebook you’ll find Jeffrey T. Hancock from Cornell University. If you go to the Minerva initiative website you’ll find that Jeffrey Hancock received funding from the Department of Defense for a study called “Cornell: Modeling Discourse and Social Dynamics in Authoritarian Regimes”. If you go to the project site for that study you’ll find a visualization program that models the spread of beliefs and disease. Cornell University is currently being funded for another DoD study right now called “Cornell: Tracking Critical-Mass Outbreaks in Social Contagions” (you’ll find the description for this project on the Minerva Initiative’s funding page).

So I went ahead and looked at the study mentioned above, and sure enough, there he is: Jeff Hancock, the same guy who analyzed the Facebook data for Cornell, which initially claimed funding from the Pentagon and then denied it. I call bull*****. Stinking bull*****.

So it seems that Facebook and the U.S. military are likely working together to study civil unrest and work on ways to manipulate the masses into apathy or misguided feelings of contentment in the face of continued banker and oligarch theft. That alone is extremely disturbing, but this whole affair is troubling even apart from the apparent military ties. For one thing, although governments and universities need to take certain precautions when conducting such “research,” private companies like Facebook apparently do not. Rather, all they have to do is get people to click “I accept” on a terms of service agreement they never read, which allows companies to do almost anything they want to you, your data and your emotions. What we basically need to do as a society is completely update our laws.
For starters, if a private corporation is going to, let’s say, totally violate your most basic civil liberties as defined under the Bill of Rights, a simple terms of service agreement should not be sufficient. For more invasive violations of such rights, perhaps a one-page, simple-to-read document explaining clearly which of your basic civil liberties you are giving away should be mandatory. For example, had Facebook not partnered at the university level to analyze this data, we wouldn’t even know this happened at all. So what sort of invasive, mind-****ing behavior do you think all these large corporations with access to your personal data are up to? Every. Single. Day.

The Faculty Lounge blog put it perfectly when it stated:

Academic researchers’ status as academics already makes it more burdensome for them to engage in exactly the same kinds of studies that corporations like Facebook can engage in at will. If, on top of that, IRBs didn’t recognize our society’s shifting expectations of privacy (and manipulation) and incorporate those evolving expectations into their minimal risk analysis, that would make academic research still harder, and would only serve to help ensure that those who are most likely to study the effects of a manipulative practice and share those results with the rest of us have reduced incentives to do so. Would we have ever known the extent to which Facebook manipulates its News Feed algorithms had Facebook not collaborated with academics incentivized to publish their findings?

We can certainly have a conversation about the appropriateness of Facebook-like manipulations, data mining, and other 21st-century practices. But so long as we allow private entities freely to engage in these practices, we ought not unduly restrain academics trying to determine their effects. Recall those fear appeals I mentioned above. As one social psychology doctoral candidate noted on Twitter, IRBs make it impossible to study the effects of appeals that carry the same intensity of fear as real-world appeals to which people are exposed routinely, and on a mass scale, with unknown consequences. That doesn’t make a lot of sense. What corporations can do at will to serve their bottom line, and non-profits can do to serve their cause, we shouldn’t make (even) harder—or impossible—for those seeking to produce generalizable knowledge to do.

If you read Liberty Blitzkrieg, you know I strongly dislike Facebook as a company. However, this is much bigger than just one experiment by Facebook with what appear to be military ties. What this is really about is the frightening reality that these sorts of things are happening every single day, and we have no idea they are happening. We need to draw the lines as far as to what extent we as a society wish to be data-mined and experimented on by corporations with access to all of our private data. Until we do this, we will continue to be violated and manipulated at will.

For some of my Facebook-critical articles from earlier this year, read:

The Chief Operating Officer of Facebook Wants to Ban the Word “Bossy”
How UK Prime Minister David Cameron Paid Thousands of Dollars for Facebook “Likes”
How Facebook Exploits Underage Girls in its Quest for Ad Revenue
This Man’s $600,000 Facebook Disaster is a Warning For All Small Businesses
Some of Facebook Inc.'s biggest investors now plan to cash out as much as half of their stakes in the social network's initial public offering this week. Facebook said Wednesday it will boost the size of its IPO by 25%, or about 100 million shares, as early investors sell as much as $3.8 billion in additional shares. Goldman Sachs Group Inc., Tiger Global Management and Facebook director Peter Thiel—who was one of the social network's first investors—more than doubled the amount of stock they plan to sell.