
Do These 6 Things to Move From Mediocrity to Success – PayScale Career News (blog)

Posted: August 1, 2017 at 1:42 am


Whatever your personal definition of success, you know when you're not reaching it. You just feel blah. Uninspired. Unfulfilled.

So how do you go from being mediocre to successful, whatever your definition of that may be?

Start making higher-quality choices. A recent CNBC article, "Why Most People Will Never Be Successful," asserted that success is "continuously improving who you are, how you live, how you serve and how you relate."

"That means focusing on those few things which matter most," Benjamin Hardy writes in the article. Hardy challenges, "If your daily behaviors are consistently low quality, what do you expect your life's output to be?"

When you focus on self-improvement, educating yourself and making higher-quality choices, you will see your personal success meter pointing away from mediocrity. We all have to indulge once in a while in a little mindless YouTube watching or Facebook scrolling, but start paying more attention to how consistently you're making choices that feed your brain, body and soul, and your motivation to crave something better in life.

So, before you pick up the sugar- and caffeine-laden energy drink, perhaps pick up a glass of water. Consider turning off The Bachelorette and tuning in to Frontline, and putting down People magazine and picking up the newspaper.

While it may seem counter-intuitive, focusing on being successful can actually drain you. Because you're always trying to achieve this goal or that goal, you don't enjoy the journey. Reframe how you view your work by focusing on what you enjoy about it, not whether it won an award or even a pat on the back from the boss. By focusing on your craft, you'll find more fulfillment on a daily basis.

It's easy to fall into patterns and just do what your job requires. But those who are willing to go beyond what's expected are the ones who excel.

Plan ahead to try to anticipate what might be needed, what might be next steps, and what obstacles might arise. Successful people get noticed for thinking ahead.

There's a reason everyone's heard the saying "if you don't have your health, you don't have anything": there's truth to it. If your health suffers, so will your success. According to Fast Company, practicing self-care is one of the six things the most productive people do every day. This can be in the form of a cardio workout, a light lunch-hour stroll or nightly meditation.

And get enough sleep. Arianna Huffington, Bill Gates and Jeff Bezos get seven hours of sleep, according to a list of successful people's sleep habits. Sure, some highly successful people have achieved much on far less sleep, but scientific evidence continues to mount that skipping sleep is actually the most counterproductive decision you can make.

Some of today's top entrepreneurs schedule time to think. Why? Because to stay ahead of the competition, you need new ideas. Jeff Weiner, CEO of LinkedIn, schedules two hours of thinking time every day. And several high-profile CEOs in recent years have rejected the idea of constant busyness, touting the importance of critical thinking in an ever-changing digital economy.

Because everyone's definition of success can be different, it's important that you are clear on what success means to you and what you need to do to get there. Then create the infrastructure to stay focused and to keep progressing. One way to do this is to build your own board of directors: four or five individuals who can serve as consultants in regard to work matters. When you own your career, you take responsibility for your success.

How do you challenge yourself to avoid being mediocre? We want to hear from you. Tell us your thoughts in the comments or join the conversation on Twitter.

Tags: boost productivity, tips for success, work motivation

Read more from the original source:
Do These 6 Things to Move From Mediocrity to Success - PayScale Career News (blog)

Written by grays |

August 1st, 2017 at 1:42 am

Posted in Personal Success

Well-known travel writer and ex-Sony marketer arrested for indecent assault – South China Morning Post

Posted: at 1:42 am


A well-known travel writer and prominent former marketer for Sony video game products was arrested after a man accused him of indecent assault during a job interview.

James Hong Ming-sang, 47, former director of Sony Computer Entertainment Hong Kong, allegedly touched the 22-year-old male interviewee's private parts in a North Point office on July 25.

The Sichuan-born author of more than 20 titles was also said to have kissed the victim without his consent.

The student made a police report at North Point Police Station the next day.

Hong was arrested on Monday. He was later released and must report back later this month. An Eastern District investigation team is on the case.

Hong was known to the Hong Kong public for his promotion of the PlayStation video game console. He quit Sony in 2013.

Hong has hosted TV and radio travel programmes and has been writing travel columns. He is also known for his personal success story.

Hong started out as a 16-year-old newcomer to Hong Kong from Sichuan province with just HK$2 in his pocket.

He entered Form Four at a private school but had a difficult time adjusting since he knew very little English and Cantonese. He put in extra effort to catch up and eventually graduated from the University of Hong Kong with a psychology and philosophy degree.

Hong's Japanese-language ability helped earn him a position at Sony Computer Entertainment in 1998. Over 13 years, he worked his way up from marketing officer to general manager of the marketing division and was regarded as having played a significant part in the local success of the PlayStation.

In 2011, he became the company's first Chinese director.

He published his first travel book in 2010, raising HK$500,000 for charity.

Hong's last public appearance was on July 24 at the annual Hong Kong Book Fair, where he promoted two newly published books.

Read the rest here:
Well-known travel writer and ex-Sony marketer arrested for indecent assault - South China Morning Post

Written by grays |

August 1st, 2017 at 1:42 am

Posted in Personal Success

Anti-union nonprofit finds success undercutting Big Labor – Washington Examiner

Posted: at 1:42 am


The most effective solutions to complicated problems can sometimes be the simplest. Case in point: one conservative group's strategy to undermine Big Labor is as straightforward as informing certain union members that they do not have to pay dues.

The impetus for this effort came from the Supreme Court's decision in Harris v. Quinn. The ruling forbade a common but very shady arrangement by which several Democratic governors had forced more than 100,000 homecare workers nationwide into unions. States like Illinois, Michigan, Oregon and others invented the fiction that these workers were government employees on the basis that their clients (usually chronically ill family members) were using Medicaid money to pay for their care. This scheme was intended to allow the unions (specifically the Service Employees International Union) to skim from benefits intended for the sick and poor.

Under the Supreme Court's ruling in Harris, homecare workers in this situation are free to opt out of union membership and pay no dues, because they aren't true government employees. Many of them have already taken advantage of the decision and quit their union, but many more remain unaware that they have this option, because it isn't something the union especially wants them to know about.

Enter the Freedom Foundation, a nonprofit based in the Pacific Northwest. It recently announced that its efforts have resulted in an estimated 10,000 workers withholding dues, "costing the unions and Democratic candidates over $10 million to date."

"By educating union members about their right to stop paying union dues, Freedom Foundation defunds Big Labor," the organization says in a promotional video released Monday. "That means more money for the workers and less money for liberal politicians."

On Friday, the foundation also revealed new data from a public records request about the work it has been doing along these lines in the Oregon market. It turns out that 11,399 of the 28,667 homecare and personal support workers who had been forced into the union have quit SEIU 503 in the last two years, the very workers the Freedom Foundation had been reaching out to in its information campaign. The drop in dues that unions suffer as a result of these campaigns cuts into their cash supply, a pot that's used to fund the campaigns of Democratic candidates around the country, despite many union members being conservative.

According to the Center for Responsive Politics, the SEIU donated $1,461,756 to congressional candidates in the 2016 cycle, all of which went to Democrats. Meanwhile, an AFL-CIO exit poll found 37 percent of union members voted for Donald Trump over Hillary Clinton last November.

It's only logical then to assume the Foundation's efforts will result in fewer union members and less money flowing to Democratic candidates as right-leaning workers are informed of their rights, a venture that could impact campaigns on both the local and national levels in future years.

Emily Jashinsky is a commentary writer for the Washington Examiner.

Link:
Anti-union nonprofit finds success undercutting Big Labor - Washington Examiner

Written by admin |

August 1st, 2017 at 1:42 am

Posted in Personal Success

Enlightenment – ESO Academy

Posted: at 1:41 am


What is Enlightenment?

Enlightenment in Elder Scrolls Online is basically a reduction in the EXP required to earn a Champion Point in the Champion System, which occurs every 24 hours. Enlightenment was put in place to allow players who have less time to invest in ESO a chance to catch up and remain competitive.

A normal Champion Point is earned for every 400,000 EXP that you gain.

If you are Enlightened it will only take 100,000 EXP to earn a Champion Point.

Once you unlock the Champion System, when one of your characters reaches Veteran Rank 1, you will earn Champion Points every time you gain 400,000 EXP. You will also set off an invisible timer which resets every 24 hours. This timer is what gives you Enlightenment. This timer gives out 1 Champion Point worth of Enlightenment every 24 hours and you can accrue a maximum of 12 Champion Points worth of Enlightenment before the timer stops to wait for you to come back.

Every 24 hours you get the chance to earn 1 Champion Point that only needs 100,000 EXP.

So you can log on every day and earn 1 Champion Point with 100,000 EXP. Enlightenment also accrues if you don't use it. So say that you miss a few days and log on after a 5-day break. You will now have accrued enough Enlightenment to earn 5 Champion Points that only need 100,000 EXP. Enlightenment accrues up to a maximum of 12 days. So if you come back after a 15-day break you will only have enough Enlightenment to earn 12 Champion Points at 100,000 EXP.
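
To make that arithmetic concrete, here is a minimal sketch in Python (illustrative only; the function name, its parameters and the calculation structure are assumptions for this example, not anything from the game's actual code) of how accrued Enlightenment, capped at 12 days, changes the EXP cost of Champion Points:

```python
# Illustrative constants taken from the article; names are assumptions, not game code.
ENLIGHTENED_COST = 100_000    # EXP per Champion Point while Enlightened
NORMAL_COST = 400_000         # EXP per Champion Point at the normal rate
MAX_ENLIGHTENED_POINTS = 12   # Enlightenment stops accruing after 12 days


def champion_points_earned(days_accrued: int, exp_gained: int) -> int:
    """Estimate Champion Points earned from a lump of EXP, spending accrued
    Enlightenment (1 cheap point per day away, capped at 12) before the normal rate."""
    cheap_points = min(days_accrued, MAX_ENLIGHTENED_POINTS)
    points = 0
    # Spend the cheap, Enlightened points first.
    while cheap_points > 0 and exp_gained >= ENLIGHTENED_COST:
        exp_gained -= ENLIGHTENED_COST
        cheap_points -= 1
        points += 1
    # Any remaining EXP earns points at the normal 400,000 EXP rate.
    points += exp_gained // NORMAL_COST
    return points


# After a 5-day break with 700,000 EXP gained: 5 Enlightened points use
# 500,000 EXP, and the remaining 200,000 EXP is not enough for a normal point.
print(champion_points_earned(days_accrued=5, exp_gained=700_000))  # -> 5
```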

You can tell if you are Enlightened because a message will appear when you first log on saying that you are Enlightened. You can also check by hovering over the Champion Point bar, which is just below the main EXP bar. You can open your inventory, or another menu, and you will find these bars in the top left corner of the screen.

The whole system can be a little confusing at first.

Here is what ZOS had to say to try to clarify things a little.

ZOS_GinaBruno

March 19, 2015

Hi everyone!

We've been seeing a lot of confusion over Enlightenment and how the system works, so we wanted to explain it a bit more thoroughly for you. In simplest terms, Enlightenment is a bonus for the XP you earn while playing that counts toward your Champion Point progression.

Every 24 hours, you receive enough Enlightenment to earn one Champion Point at the rate of 100,000 XP per Point; after your Enlightenment is used up, you will return to requiring 400,000 XP to earn a Champion Point. The 24-hour timer starts when you log in with your first Veteran character or unlock the Champion System, whichever is first, and resets every 24 hours after it first starts. You will receive a message on your screen that you are Enlightened, and you can also hover over your XP bar to see if you are Enlightened.

If you end up not using up all your Enlightenment while you play, it will continue to accrue for a maximum of 12 days. Once you hit the limit, you will not gain any additional Enlightenment until you begin to use it by gaining XP.

Source

See the article here:
Enlightenment – ESO Academy

Written by admin |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

Enlightenment | Dr. Puff | Enlightenment Podcast

Posted: at 1:41 am


The doors to living an enlightened life open with the keys of silence and just being. Leave your thoughts behind and enter.

-Dr. Robert Puff, Meditation Expert

When we wake up to who we are, something happens. We stop identifying with our egoic selves because we realize they are impermanent and only that which is permanent can be who we are.

We aren't our bodies, we aren't our memories, we aren't our thoughts, we aren't our feelings. We aren't any of these things, so we stop identifying with them. What happens is that detachment develops. An aloofness or distancing from everything that occurs. We wake up to the fact that life is an extended dream and a relaxation is able to set in. It's a sense of calm or a feeling that all is well.

We lose our identity with our lives, thoughts and feelings, so we witness them but we don't engage with them. We notice them, but we don't create stories with them. Since we don't create … Read More

When I was an undergraduate at university many years ago, my deep enjoyment and love for the works of William Shakespeare blossomed. I had the privilege of taking a Shakespearean class, and then during one summer in my undergraduate years I was able to travel through Europe inexpensively on a bike and a Europass to see the great sites. A memory I remember most is going to Stratford-upon-Avon and watching a William Shakespeare play. I don't know where my passion and love for his plays comes from, but it has been a deep part of my life. His writings have also taught me many things.

When I was in England many years ago for the first time, I was standing in the back of the audience watching the play As You Like It that was performed not too far from the Read More

The Most Powerful Mantra: Learn to Lose the Ego and Awaken to Who We Are

What Can Help Me Achieve Enlightenment? Begin Your Journey with the Right Tools

There are three big steps towards spiritual enlightenment. Any one of us can take these steps, but they are crucial aspects in moving in the direction of living an awakened life.

The first step we have to take is what I call earnestness. What I mean by this is that in order for us to move in the direction towards enlightenment, you really have to want it. It can't just be one of your many endeavors. You can't say, "I'll work during the day, sleep at night and during Saturday and Sunday evenings I'll study and work towards enlightenment." This is not going to cut it. In many ways we have to eat, drink and sleep our paths towards enlightenment. It's very crucial that it consumes our lives. Many people have gone down the path towards enlightenment and most have failed. It's … Read More

Continue reading here:
Enlightenment | Dr. Puff | Enlightenment Podcast

Written by admin |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

history of Europe – The Enlightenment | Britannica.com

Posted: at 1:41 am


The Enlightenment was both a movement and a state of mind. The term represents a phase in the intellectual history of Europe, but it also serves to define programs of reform in which influential literati, inspired by a common faith in the possibility of a better world, outlined specific targets for criticism and proposals for action. The special significance of the Enlightenment lies in its combination of principle and pragmatism. Consequently, it still engenders controversy about its character and achievements. Two main questions and, relating to each, two schools of thought can be identified. Was the Enlightenment the preserve of an elite, centred on Paris, or a broad current of opinion that the philosophes, to some extent, represented and led? Was it primarily a French movement, having therefore a degree of coherence, or an international phenomenon, having as many facets as there were countries affected? Although most modern interpreters incline to the latter view in both cases, there is still a case for the French emphasis, given the genius of a number of the philosophes and their associates. Unlike other terms applied by historians to describe a phenomenon that they see more clearly than could contemporaries, it was used and cherished by those who believed in the power of mind to liberate and improve. Bernard de Fontenelle, popularizer of the scientific discoveries that contributed to the climate of optimism, wrote in 1702 anticipating "a century which will become more enlightened day by day, so that all previous centuries will be lost in darkness by comparison." Reviewing the experience in 1784, Immanuel Kant saw an emancipation from superstition and ignorance as having been the essential characteristic of the Enlightenment.

Before Kant's death the spirit of the siècle des Lumières (literally, "century of the Enlightened") had been spurned by Romantic idealists, its confidence in man's sense of what was right and good mocked by revolutionary terror and dictatorship, and its rationalism decried as being complacent or downright inhumane. Even its achievements were critically endangered by the militant nationalism of the 19th century. Yet much of the tenor of the Enlightenment did survive in the liberalism, toleration, and respect for law that have persisted in European society. There was therefore no abrupt end or reversal of enlightened values.

Nor had there been such a sudden beginning as is conveyed by the critic Paul Hazard's celebrated aphorism: "One moment the French thought like Bossuet; the next moment like Voltaire." The perceptions and propaganda of the philosophes have led historians to locate the Age of Reason within the 18th century or, more comprehensively, between the two revolutions (the English of 1688 and the French of 1789), but in conception it should be traced to the humanism of the Renaissance, which encouraged scholarly interest in Classical texts and values. It was formed by the complementary methods of the Scientific Revolution, the rational and the empirical. Its adolescence belongs to the two decades before and after 1700 when writers such as Jonathan Swift were employing the artillery of words to impress the secular intelligentsia created by the growth in affluence, literacy, and publishing. Ideas and beliefs were tested wherever reason and research could challenge traditional authority.

In a cosmopolitan culture it was the preeminence of the French language that enabled Frenchmen of the 17th century to lay the foundations of cultural ascendancy and encouraged the philosophes to act as the tutors of 18th-century Europe. The notion of a realm of philosophy superior to sectarian or national concerns facilitated the transmission of ideas. "I flatter myself," wrote Denis Diderot to the Scottish philosopher David Hume, "that I am, like you, citizen of the great city of the world." "A philosopher," wrote Edward Gibbon, "may consider Europe as a great republic, whose various inhabitants have attained almost the same level of politeness and cultivation." This magisterial pronouncement by the author of The Decline and Fall of the Roman Empire (1776–88) recalls the common source: the knowledge of Classical literature.

The scholars of the Enlightenment recognized a joint inheritance, Christian as well as Classical. In rejecting, or at least reinterpreting, the one and plundering the other, they had the confidence of those who believed they were masters of their destiny. They felt an affinity with the Classical world and saluted the achievement of the Greeks, who discovered a regularity in nature and its governing principle, the reasoning mind, as well as that of the Romans, who adopted Hellenic culture while contributing a new order and style: on their law was founded much of church and civil law. Steeped in the ideas and language of the classics but unsettled in beliefs, some Enlightenment thinkers found an alternative to Christian faith in the form of a neo-paganism. The morality was based on reason; the literature, art, and architecture were already supplying rules and standards for educated taste.

The first chapter of Voltaire's Siècle de Louis XIV specified the four happy ages: the centuries of Pericles and Plato, of Cicero and Caesar, of the Medicean Renaissance, and, appositely, of Louis XIV. The contrast is with the ages of belief, which were wretched and backward. Whether denouncing Gothic taste or clerical fanaticism, writers of the Enlightenment constantly resort to images of relapse and revival. Typically, Jean d'Alembert wrote in the Preliminary Discourse to the Encyclopédie of a "revival of letters," "regeneration of ideas," and "return to reason and good taste." The philosophes knew enough to be sure that they were entering a new golden age through rediscovery of the old but not enough to have misgivings about a reading of history which, being grounded in a culture that had self-evident value, provided ammunition for the secular crusade.

"The new philosophy puts all in doubt," wrote the poet John Donne. Early 17th-century poetry and drama abounded in expressions of confusion and dismay about the world, God, and man. The gently questioning essays of the 16th-century French philosopher Michel de Montaigne, musing on human folly and fanaticism, continued to be popular long after his time, for they were no less relevant to the generation that suffered from the Thirty Years War. Unsettling scientific views were gaining a hold. As the new astronomy of Copernicus and Galileo, with its heliocentric view, was accepted, the firm association between religious beliefs, moral principles, and the traditional scheme of nature was shaken. In this process, mathematics occupied the central position. It was, in the words of René Descartes, "the general science which should explain all that can be known about quantity and measure, considered independently of any application to a particular subject." It enabled its practitioners to bridge gaps between speculation and reasonable certainty: Johannes Kepler thus proceeded from his study of conic sections to the laws of planetary motion. When, however, Fontenelle wrote of Descartes, "Sometimes one man gives the tone to a whole century," it was not merely of his mathematics that he was thinking. It was the system and philosophy that Descartes derived from the application of mathematical reasoning to the mysteries of the world (all that is meant by Cartesianism) which was so influential. The method expounded in his Discourse on Method (1637) was one of doubt: all was uncertain until established by reasoning from self-evident propositions, on principles analogous to those of geometry. It was serviceable in all areas of study. There was a mechanistic model for all living things.

A different track had been pursued by Francis Bacon, the great English lawyer and savant, whose influence eventually proved as great as that of Descartes. He called for a new science, to be based on organized and collaborative experiment with a systematic recording of results. General laws could be established only when research had produced enough data and then by inductive reasoning, which, as described in his Novum Organum (1620), "derives from particulars, rising by a gradual and unbroken ascent, so that it arrives at the most general axioms last of all." These must be tried and proved by further experiments. Bacon's method could lead to the accumulation of knowledge. It also was self-correcting. Indeed, it was in some ways modern in its practical emphasis. Significantly, whereas the devout humanist Thomas More had placed his Utopia in a remote setting, Bacon put New Atlantis (1627) in the future. "Knowledge is power," he said, perhaps unoriginally but with the conviction that went with a vision of mankind gaining mastery over nature. Thus were established the two poles of scientific endeavour, the rational and the empirical, between which enlightened man was to map the ground for a better world.

Bacon's inductive method is flawed through his insufficient emphasis on hypothesis. Descartes was on strong ground when he maintained that philosophy must proceed from what is definable to what is complex and uncertain. He wrote in French rather than the customary Latin so as to exploit its value as a vehicle for clear and logical expression and to reach a wider audience. Cartesian rationalism, as applied to theology, for example by Nicholas Malebranche, who set out to refute the pantheism of Benedict de Spinoza, was a powerful solvent of traditional belief: God was made subservient to reason. While Descartes maintained his hold on French opinion, across the Channel Isaac Newton, a prodigious mathematician and a resourceful and disciplined experimenter, was mounting a crucial challenge. His Philosophiae Naturalis Principia Mathematica (1687; Mathematical Principles of Natural Philosophy) ranks with the Discourse on Method in authority and influence as a peak in the 17th-century quest for truth. Newton did not break completely with Descartes and remained faithful to the latter's fundamental idea of the universe as a machine. But Newton's machine operated according to a series of laws, the essence of which was that the principle of gravitation was everywhere present and efficient. The onus was on the Cartesians to show not only that their mechanics gave a truer explanation but also that their methods were sounder. Christiaan Huygens was both a loyal disciple of Descartes and a formidable mathematician and inventor in his own right, who had worked out the first tenable theory of centrifugal force. His dilemma is instructive. He acknowledged that Newton's assumption of forces acting between members of the solar system was justified by the correct conclusions he drew from it, but he would not go on to accept that attraction was affecting every pair of particles, however minute. When Newton identified gravitation as a property inherent in corporeal matter, Huygens thought that absurd and looked for an agent acting constantly according to certain laws. Some believed that Newton was returning to occult qualities. Eccentricities apart, his views were not easy to grasp; those who actually read the Principia found it painfully difficult. Cartesianism was more accessible and appealing.

Gradually, however, Newton's work won understanding. One medium, ironically, was an outstanding textbook of Cartesian physics, Jacques Rohault's Traité de physique (1671), with detailed notes setting out Newton's case. In 1732 Pierre-Louis de Maupertuis put the Cartesians on the defensive by his defense of Newton's right to employ a principle the cause of which was yet unknown. In 1734, in his Philosophical Letters, Voltaire introduced Newton as the destroyer of the system of Descartes. His authority clinched the issue. Newton's physics was justified by its successful application in different fields. The return of Halley's comet was accurately predicted. Charles Coulomb's torsion balance proved that Newton's law of inverse squares was valid for electromagnetic attraction. Cartesianism reduced nature to a set of habits within a world of rules; the new attitude took note of accidents and circumstances. Observation and experiment revealed nature as untidy, unpredictable, a tangle of conflicting forces. In classical theory, reason was presumed to be common to all human beings and its laws immutable. In Enlightenment Europe, however, there was a growing impatience with systems. The most creative of scientists, such as Boyle, Harvey, and Leeuwenhoek, found sufficient momentum for discovery on science's front line. The controversy was creative because both rational and empirical methods were essential to progress. Like the literary battle between the ancients and the moderns or the theological battle between Jesuits and Jansenists, the scientific debate was a school of advocacy.

If Newton was supremely important among those who contributed to the climate of the Enlightenment, it is because his new system offered certainties in a world of doubts. The belief spread that Newton had explained forever how the universe worked. This cautious, devout empiricist lent the imprint of genius to the great idea of the Enlightenment: that man, guided by the light of reason, could explain all natural phenomena and could embark on the study of his own place in a world that was no longer mysterious. Yet he might otherwise have been aware more of disintegration than of progress or of theories demolished than of truths established. This was true even within the expanding field of the physical sciences. To gauge the mood of the world of intellect and fashion, of French salons or of such institutions as the Royal Society, it is essential to understand what constituted the crisis in the European mind of the late 17th century.

At the heart of the crisis was the critical examination of Christian faith, its foundations in the Bible, and the authority embodied in the church. In 1647 Pierre Gassendi had revived the atomistic philosophy of Lucretius, as outlined in On the Nature of Things. He insisted on the Divine Providence behind Epicurus' atoms and voids. Critical examination could not fail to be unsettling because the Christian view was not confined to questions of personal belief and morals, or even history, but comprehended the entire nature of God's world. The impact of scientific research must be weighed in the wider context of an intellectual revolution. Different kinds of learning were not then as sharply distinguished, because of their appropriate disciplines and terminology, as they are in an age of specialization. At that time philomaths could still be polymaths. Newton's contemporary, Gottfried Wilhelm Leibniz, whose principal contribution to philosophy was that substance exists only in the form of monads, each of which obeys the laws of its own self-determined development while remaining in complete accord with all the rest, influenced his age by concluding that since God contrived the universal harmony this world must be the best of all possible worlds. He also proposed legal reforms, invented a calculating machine, devised a method of the calculus independent of Newton's, improved the drainage of mines, and laboured for the reunification of the Roman Catholic and Lutheran churches.

The writing of John Locke, familiar to the French long before the eventual victory of his kind of empiricism, further reveals the range of interests that an educated man might pursue and its value in the outcome: discrimination, shrewdness, and originality. The journal of Locke's travels in France (1675–79) is studded with notes on botany, zoology, medicine, weather, instruments of all kinds, and statistics, especially those concerned with prices and taxes. It is a telling introduction to the world of the Enlightenment, in which the possible was always as important as the ideal and physics could be more important than metaphysics. Locke spent the years from 1683 to 1689 in Holland, in refuge from high royalism. There he associated with other literary exiles, who were united in abhorrence of Louis XIV's religious policies, which culminated in the revocation of the Edict of Nantes (1685) and the flight of more than 200,000 Huguenots. During this time Locke wrote the Essay on Toleration (1689). The coincidence of the Huguenot dispersion with the English revolution of 1688–89 meant a cross-fertilizing debate in a society that had lost its bearings. The avant-garde accepted Locke's idea that the people had a sovereign power and that the prince was merely a delegate. His Second Treatise of Civil Government (1690) offered a theoretical justification for a contractual view of monarchy on the basis of a revocable agreement between ruler and ruled. It was, however, his writings about education, toleration, and morality that were most influential among the philosophes, for whom his political theories could be only of academic interest. Locke was the first to treat philosophy as purely critical inquiry, having its own problems but essentially similar to other sciences. Voltaire admired what Locke called his "historical plain method" because he had not written "a romance of the soul" but offered "a history of it." The avowed object of his Essay Concerning Human Understanding (1690) was "to inquire into the original, certainty, and extent of human knowledge; together with the grounds and degrees of belief, opinion, and assent." For Locke, the mind derives the materials of reason and knowledge from experience. Unlike Descartes' view that man could have innate ideas, in Locke's system knowledge consists of ideas imprinted on the mind through observation of external objects and reflection on the evidence provided by the senses. Moral values, Locke held, are derived from sensations of pleasure or pain, the mind labeling good what experience shows to give pleasure. There are no innate ideas; there is no innate depravity.

Though he suggested that souls were born without the idea of God, Locke did not reject Christianity. Sensationalism, he held, was a God-given principle that, properly followed, would lead to conduct that was ethically sound. He had, however, opened a way to disciples who proceeded to conclusions that might have been far from the master's mind. One such was the Irish bishop George Berkeley who affirmed, in his Treatise on the Principles of Human Knowledge (1710), that there was no proof that matter existed beyond the idea of it in the mind. Most philosophers after Descartes decided the question of the dualism of mind and matter by adopting a materialist position; whereas they eliminated mind, Berkeley eliminated matter, and he was therefore neglected. Locke was perhaps more scientific and certainly more in tune with the intellectual and practical concerns of the age. Voltaire presented Locke as the advocate of rational faith and of sensationalist psychology; Locke's posthumous success was assured. In the debate over moral values, Locke provided a new argument for toleration. Beliefs, like other human differences, were largely the product of environment. Did it not therefore follow that moral improvement should be the responsibility of society? Finally, since human irrationality was the consequence of false ideas, instilled by faulty schooling, should not education be a prime concern of rulers? To pose those questions is to anticipate the agenda of the Enlightenment.

Continue reading here:
history of Europe - The Enlightenment | Britannica.com

Written by simmons |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

Enlightenment (spiritual) – Wikipedia

Posted: at 1:41 am


Enlightenment is the "full comprehension of a situation".[web 1] The term is commonly used to denote the Age of Enlightenment,[note 1] but is also used in Western cultures in a religious context. It translates several Buddhist terms and concepts, most notably bodhi,[note 2] kensho and satori. Related terms from Asian religions are moksha (liberation) in Hinduism, Kevala Jnana in Jainism, and ushta in Zoroastrianism.

In Christianity, the word "enlightenment" is rarely used, except to refer to the Age of Enlightenment and its influence on Christianity. Roughly equivalent terms in Christianity may be illumination, kenosis, metanoia, revelation, salvation and conversion.

Perennialists and Universalists view enlightenment and mysticism as equivalent terms for religious or spiritual insight.

The English term "enlightenment" has commonly been used to translate several Sanskrit, Pali,[web 2] Chinese and Japanese terms and concepts, especially bodhi, prajna, kensho, satori and buddhahood.

Bodhi is a Theravada term. It literally means "awakening" and "understanding". Someone who is awakened has gained insight into the workings of the mind which keeps us imprisoned in craving, suffering and rebirth,[web 1] and has also gained insight into the way that leads to nirvana, the liberation of oneself from this imprisonment.

Prajna is a Mahayana term. It refers to insight into our true nature, which according to Madhyamaka is empty of a personal essence in the stream of experience. But it also refers to the Tathāgata-garbha or Buddha-nature, the essential basic-consciousness beyond the stream of experience.

In Zen, kensho means "seeing into one's true nature". Satori is often used interchangeably with kensho, but refers to the experience of kensho.

Buddhahood is the attainment of full awakening and becoming a Buddha. According to the Tibetan Thubten Yeshe,[web 3] enlightenment

[means] full awakening; buddhahood. The ultimate goal of Buddhist practice, attained when all limitations have been removed from the mind and one's positive potential has been completely and perfectly realized. It is a state characterized by infinite compassion, wisdom and skill.[web 4]

In Indian religions moksha (Sanskrit: mokṣa; liberation) or mukti (Sanskrit; release), both from the root muc ("to let loose, let go"), is the final extrication of the soul or consciousness (purusha) from samsara and the bringing to an end of all the suffering involved in being subject to the cycle of repeated death and rebirth (reincarnation).

Advaita Vedanta (IAST: Advaita Vedānta) is a philosophical concept where followers seek liberation/release by recognizing identity of the Self (Atman) and the Whole (Brahman) through long preparation and training, usually under the guidance of a guru, that involves efforts such as knowledge of scriptures, renunciation of worldly activities, and inducement of direct identity experiences. Originating in India before 788 AD, Advaita Vedanta is widely considered the most influential and most dominant[web 5] sub-school of the Vedānta (literally, end or the goal of the Vedas, Sanskrit) school of Hindu philosophy. Other major sub-schools of Vedānta are Viśiṣṭādvaita and Dvaita; while the minor ones include Suddhadvaita, Dvaitadvaita and Achintya Bhedabheda.

Advaita (literally, non-duality) is a system of thought where "Advaita" refers to the identity of the Self (Atman) and the Whole (Brahman).[note 3] Recognition of this identity leads to liberation. Attaining this liberation takes a long preparation and training under the guidance of a guru.

The key source texts for all schools of Vedānta are the Prasthanatrayi, the canonical texts consisting of the Upanishads, the Bhagavad Gita and the Brahma Sutras. The first person to explicitly consolidate the principles of Advaita Vedanta was Shankara Bhagavadpada, while the first historical proponent was Gaudapada, the guru of Shankara's guru Govinda Bhagavatpada.

Shankara systematized the works of preceding philosophers. His system of Vedanta introduced the method of scholarly exegesis on the accepted metaphysics of the Upanishads. This style was adopted by all the later Vedanta schools.[citation needed]

Shankara's synthesis of Advaita Vedanta is summarized in this quote from the Vivekacmai, one of his Prakaraa grathas (philosophical treatises):[note 4]

In half a couplet I state, what has been stated by crores of texts;

that is Brahman alone is real, the world is mithyā (not independently existent),

In the 19th century Vivekananda played a major role in the revival of Hinduism, and the spread of Advaita Vedanta to the West via the Ramakrishna Mission. His interpretation of Advaita Vedanta has been called "Neo-Vedanta".

In a talk on "The absolute and manifestation" given in London in 1896, Swami Vivekananda said,

I may make bold to say that the only religion which agrees with, and even goes a little further than modern researchers, both on physical and moral lines is the Advaita, and that is why it appeals to modern scientists so much. They find that the old dualistic theories are not enough for them, do not satisfy their necessities. A man must have not only faith, but intellectual faith too.[web 6]

Vivekananda emphasized samadhi as a means to attain liberation. Yet this emphasis is not to be found in the Upanishads nor in Shankara. For Shankara, meditation and Nirvikalpa Samadhi are means to gain knowledge of the already existing unity of Brahman and Atman, not the highest goal itself:

[Y]oga is a meditative exercise of withdrawal from the particular and identification with the universal, leading to contemplation of oneself as the most universal, namely, Consciousness. This approach is different from the classical yoga of complete thought suppression.

Vivekananda's modernisation has been criticized:

Without calling into question the right of any philosopher to interpret Advaita according to his own understanding of it, [...] the process of Westernization has obscured the core of this school of thought. The basic correlation of renunciation and Bliss has been lost sight of in the attempts to underscore the cognitive structure and the realistic structure which according to Samkaracarya should both belong to, and indeed constitute the realm of māyā.

Neo-Advaita is a new religious movement based on a modern, Western interpretation of Advaita Vedanta, especially the teachings of Ramana Maharshi. Neo-Advaita is being criticized[note 6][note 7][note 8] for discarding the traditional prerequisites of knowledge of the scriptures and "renunciation as necessary preparation for the path of jnana-yoga". Notable neo-advaita teachers are H. W. L. Poonja, his students Gangaji, Andrew Cohen,[note 9] Madhukar[23] and Eckhart Tolle.

The prime means to reach moksha is through the practice of yoga (Sanskrit, Pāli: yoga), a commonly known generic term for physical, mental, and spiritual disciplines which originated in ancient India. Specifically, yoga is one of the six āstika ("orthodox") schools of Hindu philosophy. It is based on the Yoga Sūtras of Patañjali. Various traditions of yoga are found in Hinduism, Buddhism, Jainism and Sikhism.[note 10]

Prephilosophical speculations and diverse ascetic practices of the first millennium BCE were systematized into a formal philosophy in the early centuries CE by the Yoga Sutras of Patanjali. By the turn of the first millennium, Hatha yoga emerged as a prominent tradition of yoga distinct from Patanjali's Yoga Sutras. While the Yoga Sutras focus on discipline of the mind, Hatha yoga concentrates on health and purity of the body.

Hindu monks, beginning with Swami Vivekananda, brought yoga to the West in the late 19th century. In the 1980s, yoga became popular as a physical system of health exercises across the Western world. Many studies have tried to determine the effectiveness of yoga as a complementary intervention for cancer, schizophrenia, asthma and heart patients. In a national survey, long-term yoga practitioners in the United States reported musculoskeletal and mental health improvements.

Classical Advaita Vedanta emphasises the path of jnana yoga, a progression of study and training to attain moksha. It consists of four stages:[32][web 12]

The paths of bhakti yoga and karma yoga are subsidiary.

In bhakti yoga, practice centers on the worship of God in any way and in any form, like Krishna or Ayyappa. Adi Shankara himself was a proponent of devotional worship or Bhakti. But Adi Shankara taught that while Vedic sacrifices, puja and devotional worship can lead one in the direction of jnana (true knowledge), they cannot lead one directly to moksha. At best, they can serve as means to obtain moksha via shukla gati.[citation needed]

Karma yoga is the way of doing our duties, in disregard of personal gains or losses. According to Sri Swami Sivananda,

Karma Yoga is consecration of all actions and their fruits unto the Lord. Karma Yoga is performance of actions dwelling in union with the Divine, removing attachment and remaining balanced ever in success and failure.

Karma Yoga is selfless service unto humanity. Karma Yoga is the Yoga of action which purifies the heart and prepares the Antahkarana (the heart and the mind) for the reception of Divine Light or attainment of Knowledge of the Self. The important point is that you will have to serve humanity without any attachment or egoism.[web 15]

Jainism (Sanskrit: Jainadharma, Tamil: Samaṇam, Bengali: Jainadharma, Telugu: Jainamata, Malayalam: Jainmat, Kannada: Jaina dharma) is an Indian religion that prescribes a path of non-violence towards all living beings. Its philosophy and practice emphasize the necessity of self-effort to move the soul toward divine consciousness and liberation. Any soul that has conquered its own inner enemies and achieved the state of supreme being is called a jina ("conqueror" or "victor"). The ultimate status of these perfect souls is called siddha. Ancient texts also refer to Jainism as shramana dharma (self-reliant) or the "path of the nirganthas" (those without attachments or aversions).

In Jainism, the highest form of pure knowledge a soul can attain is called Kevala Jnana (Sanskrit) or Kevala Ṇāṇa (Prakrit), from kevala, which means "absolute or perfect", and jñāna, which means "knowledge". Kevala is the state of isolation of the jīva from the ajīva attained through ascetic practices which burn off one's karmic residues, releasing one from bondage to the cycle of death and rebirth. Kevala Jñāna thus means infinite knowledge of self and non-self, attained by a soul after annihilation of all the ghātiyā karmas. The soul which has reached this stage achieves moksha or liberation at the end of its life span.

Mahavira, the 24th tirthankara of Jainism, is said to have practised rigorous austerities for 12 years before he attained enlightenment:

During the thirteenth year, in the second month of summer, in the fourth fortnight, the light (fortnight) of Vaisakha, on its tenth day, when the shadow had turned towards the east and the first wake was over, on the day called Suvrata, in the Muhurta called Vigaya, outside of the town Grimbhikagrama on the bank of the river Rjupalika, not far from an old temple, in the field of the householder Samaga, under a Sal tree, when the moon was in conjunction with the asterism Uttara Phalguni, (the Venerable One) in a squatting position with joined heels, exposing himself to the heat of the sun, after fasting two and a half days without drinking water, being engaged in deep meditation, reached the highest knowledge and intuition, called Kevala, which is infinite, supreme, unobstructed, unimpeded, complete, and full.[citation needed]

Kevala Jñāna is one of the five major events in the life of a Tirthankara and is known as Jñāna Kalyanaka, supposedly celebrated by all gods. Mahavira's Kaivalya was said to have been celebrated by the demi-gods, who constructed the Samosarana, or grand preaching assembly, for him.

In the Western world the concept of enlightenment in a religious context acquired a romantic meaning. It has become synonymous with self-realization and the true self, which is being regarded as a substantial essence which is covered over by social conditioning.[note 12]

The use of the Western word enlightenment is based on the supposed resemblance of bodhi with Aufklärung, the independent use of reason to gain insight into the true nature of our world. As a matter of fact there are more resemblances with Romanticism than with the Enlightenment: the emphasis on feeling, on intuitive insight, on a true essence beyond the world of appearances.

The equivalent term "awakening" has also been used in a Christian context,[35] namely the Great Awakenings, several periods of religious revival in American religious history. Historians and theologians identify three or four waves of increased religious enthusiasm occurring between the early 18th century and the late 19th century. Each of these "Great Awakenings" was characterized by widespread revivals led by evangelical Protestant ministers, a sharp increase of interest in religion, a profound sense of conviction and redemption on the part of those affected, an increase in evangelical church membership, and the formation of new religious movements and denominations.

Another equivalent term is Illuminationism, which was also used by Paul Demiéville in his work The Mirror of the Mind, in which he made a distinction between "illumination subite" and "illumination graduelle".[web 16] Illuminationism is a doctrine according to which the process of human thought needs to be aided by divine grace. It is the oldest and most influential alternative to naturalism in the theory of mind and epistemology.[37] It was an important feature of ancient Greek philosophy, Neoplatonism, medieval philosophy, and in particular, the Illuminationist school of Islamic philosophy.

Augustine was an important proponent of Illuminationism, stating that everything we know is taught to us by God as He casts His light over the world,[web 17] saying that "The mind needs to be enlightened by light from outside itself, so that it can participate in truth, because it is not itself the nature of truth. You will light my lamp, Lord"[38] and "You hear nothing true from me which you have not first told me."[39] Augustine's version of illuminationism is not that God gives us certain information, but rather gives us insight into the truth of the information we received for ourselves.

This romantic idea of enlightenment as insight into a timeless, transcendent reality has been popularized especially by D.T. Suzuki.[web 18][web 19] Further popularization was due to the writings of Heinrich Dumoulin.[web 20] Dumoulin viewed metaphysics as the expression of a transcendent truth, which according to him was expressed by Mahayana Buddhism, but not by the pragmatic analysis of the oldest Buddhism, which emphasizes anatta. This romantic vision is also recognizable in the works of Ken Wilber.

In the oldest Buddhism this essentialism is not recognizable.[web 21] According to critics it doesn't really contribute to a real insight into Buddhism:[web 22]

...most of them labour under the old cliché that the goal of Buddhist psychological analysis is to reveal the hidden mysteries in the human mind and thereby facilitate the development of a transcendental state of consciousness beyond the reach of linguistic expression.

A common reference in Western culture is the notion of "enlightenment experience". This notion can be traced back to William James, who used the term "religious experience" in his book, The Varieties of Religious Experience. Wayne Proudfoot traces the roots of the notion of "religious experience" further back to the German theologian Friedrich Schleiermacher (1768–1834), who argued that religion is based on a feeling of the infinite. The notion of "religious experience" was used by Schleiermacher to defend religion against the growing scientific and secular critique.

It was popularised by the Transcendentalists, and exported to Asia via missionaries. Transcendentalism developed as a reaction against 18th-century rationalism, John Locke's philosophy of Sensualism, and the predestinationism of New England Calvinism. It is fundamentally a variety of diverse sources such as Hindu texts like the Vedas, the Upanishads and the Bhagavad Gita, various religions, and German idealism.

It was adopted by many scholars of religion, of which William James was the most influential.[note 13]

The notion of "experience" has been criticised. Robert Sharf points out that "experience" is a typical Western term, which has found its way into Asian religiosity via western influences.[note 14] The notion of "experience" introduces a false notion of duality between "experiencer" and "experienced", whereas the essence of kensho is the realisation of the "non-duality" of observer and observed. "Pure experience" does not exist; all experience is mediated by intellectual and cognitive activity. The specific teachings and practices of a specific tradition may even determine what "experience" someone has, which means that this "experience" is not the proof of the teaching, but a result of the teaching. A pure consciousness without concepts, reached by "cleaning the doors of perception",[note 15] would be an overwhelming chaos of sensory input without coherence.

Nevertheless, the notion of religious experience has gained widespread use in the study of religion, and is extensively researched.

The word "enlightenment" is not generally used in Christian contexts for religious understanding or insight. More commonly used terms in the Christian tradition are religious conversion and revelation.

Lewis Sperry Chafer (1871–1952), one of the founders of Dispensationalism, uses the word "illuminism". Christians who are "illuminated" are of two groups, those who have experienced true illuminism (biblical) and those who experienced false illuminism (not from the Holy Spirit).

Christian interest in eastern spirituality has grown throughout the 20th century. Notable Christians, such as Hugo Enomiya-Lassalle and AMA Samy, have participated in Buddhist training and even become Buddhist teachers themselves. In a few places Eastern contemplative techniques have been integrated in Christian practices, such as centering prayer.[web 24] But this integration has also raised questions about the borders between these traditions.[web 25]

Western and Mediterranean culture has a rich tradition of esotericism and mysticism. The Perennial philosophy, basic to the New Age understanding of the world, regards those traditions as akin to Eastern religions which aim at awakening/ enlightenment and developing wisdom. The hypothesis that all mystical traditions share a "common core", is central to New Age, but contested by a diversity of scientists like Katz and Proudfoot.

Judaism includes the mystical tradition of Kabbalah. Islam includes the mystical tradition of Sufism. In the Fourth Way teaching, enlightenment is the highest state of Man (humanity).

A popular western understanding sees "enlightenment" as "nondual consciousness", "a primordial, natural awareness without subject or object".[web 26] It is used interchangeably with Neo-Advaita.

This nondual consciousness is seen as a common stratum to different religions. Several definitions or meanings are combined in this approach, which makes it possible to recognize various traditions as having the same essence. According to Renard, many forms of religion are based on an experiential or intuitive understanding of "the Real".

This idea of nonduality as "the central essence" is part of a modern mutual exchange and synthesis of ideas between western spiritual and esoteric traditions and Asian religious revival and reform movements.[note 16] Western predecessors are, among others, New Age, Wilber's synthesis of western psychology and Asian spirituality, the idea of a Perennial Philosophy, and Theosophy. Eastern influences are the Hindu reform movements such as Aurobindo's Integral Yoga and Vivekananda's Neo-Vedanta, the Vipassana movement, and Buddhist modernism. A truly syncretistic influence is Osho and the Rajneesh movement, a hybrid of eastern and western ideas and teachings, and a mainly western group of followers.

"Religious experiences" have "evidential value",[77] since they confirm the specific worldview of the experiencer:[78]

These experiences are cognitive in that, allegedly at least, the subject of the experience receives a reliable and accurate view of what, religiously considered, are the most important features of things. This, so far as their religious tradition is concerned, is what is most important about them. This is what makes them "salvific" or powerful to save.[79]

Yet, just like the very notion of "religious experience" is shaped by a specific discourse and habitus, the "uniformity of interpretation" may be due to the influence of religious traditions which shape the interpretation of such experiences.[78]

Yandell discerns various "religious experiences" and their corresponding doctrinal settings, which differ in structure and phenomenological content, and in the "evidential value" they present.[82] Yandell discerns five sorts:[83]

Various philosophers and cognitive scientists state that there is no "true self" or a "little person" (homunculus) in the brain that "watches the show," and that consciousness is an emergent property that arises from the various modules of the brain in ways that are yet far from understood.[90] According to Susan Greenfield, the "self" may be seen as a composite, whereas Douglas R. Hofstadter describes the sense of "I" as a result of cognitive processes.

This is in line with the Buddhist teachings, which state that

[...] what we call 'I' or 'being,' is only a combination of physical and mental aggregates which are working together interdependently in a flux of momentary change within the law of cause and effect, and that there is nothing permanent, everlasting, unchanging, and eternal in the whole of existence.

In this vein, Parfit called the Buddha the "first bundle theorist".

The idea that the mind is the result of the activities of neurons in the brain was most notably popularized by Francis Crick, the co-discoverer of DNA, in his book The Astonishing Hypothesis.[note 17] The basic idea can be traced back to at least Étienne Bonnot de Condillac. According to Crick, the idea was not a novel one:

[...] an exceptionally clear statement of it can be found in a well known paper by Horace Barlow.

Users of entheogens throughout the ages have claimed spiritual enlightenment through the use of these substances; their use and prevalence through history is well recorded and continues today. In modern times there has been increased interest in these practices, for example the rise of interest in Ayahuasca. The psychological effects of these substances have been the subject of scientific research focused on understanding their physiological basis.

Read more here:
Enlightenment (spiritual) - Wikipedia

Written by grays |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

From the Enlightenment to the Dark Ages: How new atheism slid … – Salon

Posted: at 1:41 am


The new atheist movement emerged shortly after the 9/11 attacks with a best-selling book by Sam Harris called The End of Faith. This was followed by engaging tomes authored by Richard Dawkins, Daniel Dennett and the late Christopher Hitchens, among others. Avowing to champion the values of science and reason, the movement offered a growing number of unbelievers, tired of faith-based foolishness mucking up society for the rest of us, some hope for the future. For many years I was among the new atheist movement's greatest allies.

From the start, though, the movement had some curious quirks. Although many atheists are liberals and empirical studies link higher IQs to both liberalism and atheism, Hitchens gradually abandoned his Trotskyist political affiliations for what could, in my view, be best described as a neoconservative outlook. Indeed, he explicitly endorsed the 2003 U.S. invasion of Iraq, now widely seen as perhaps the greatest foreign policy blunder in American history.

There were also instances in which critiques of religion, most notably Islam, went beyond what was both intellectually warranted and strategically desirable. For example, Harris wrote in a 2004 Washington Times op-ed that "We are at war with Islam." He added a modicum of nuance in subsequent sentences, but I know of no experts on Islamic terrorism who would ever suggest that uttering such a categorical statement in a public forum is judicious. As the terrorism scholar Will McCants noted in an interview that I conducted with him last year, there are circumstances in which certain phrases, even if true, are best not uttered, since they are unnecessarily incendiary. In what situation would claiming that the West is engaged in a civilizational clash with an entire religion actually improve the expected outcome?

Despite these peccadilloes, if that's what they are, new atheism still had much to offer. Yet the gaffes kept on coming, to the point that no rational person could simply dismiss them as noise in the signal. For example, Harris said in 2014 that new atheism was dominated by men because it lacks the "nurturing, coherence-building extra estrogen vibe" that you would want by default if you wanted to attract as many women as men.

This resulted in an exodus of women from the movement who decided that the new atheist label was no longer for them. (I know of many diehard atheist women who wanted nothing to do with new atheism, which is a real shame.) Harris' attempted self-exoneration didn't help, either; it merely revealed a moral scotoma in his understanding of gender, sexism and related issues. What he should have done is, quite simply, said "I'm sorry." These words, I have come to realize, are nowhere to be found in the new atheist lexicon.

Subsequent statements about profiling at airports, serious allegations of rape at atheist conferences, and tweets from major leaders that (oops!) linked to white supremacist websites further alienated women, people of color and folks that one could perhaps describe as morally normal. Yet some of us, mostly white men like myself, persisted in our conviction that, overall, the new atheist movement was still a force for good in the world. It is an extraordinary personal embarrassment that I maintained this view until the present year.

For me, it was a series of recent events that pushed me over the edge. As a philosopher, someone who cares deeply about intellectual honesty, verifiable evidence, critical thinking and moral thoughtfulness, I now find myself in direct opposition to many new atheist leaders. That is, I see my own advocacy for science, critical thought and basic morality as standing in direct opposition to their positions.

Just consider a recent tweet from one of the most prominent new atheist luminaries, Peter Boghossian: "Why is it that nearly every male who's a 3rd wave intersectional feminist is physically feeble & has terrible body habitus?" If this is what it means to be a reasonable person, then who would want to be that? Except for the vocabulary, that looks like something you'd find in Donald Trump's Twitter feed. The same goes for another of Boghossian's deep thoughts: "I've never understood how someone could be proud of being gay. How can one be proud of something one didn't work for?" It's hard to know where to even begin dissecting this bundle of shameful ignorance.

More recently, Boghossian and his sidekick James Lindsay published a hoax academic paper in a gender studies journal (except that it wasn't) in an attempt to embarrass the field of gender studies, which they, having no expertise in the field, believe is dominated by a radical feminist ideology that sees the penis as the root of all evil. I've explained twice why this hoax actually just revealed a marked lack of skepticism among skeptics themselves, so I won't go further into the details here. Suffice it to say that while bemoaning the sloppy scholarship of gender studies scholars, Boghossian and Lindsay's explanation of the hoax in a Skeptic article contained philosophical mistakes that a second-year undergraduate could detect. Even more, their argument for how the hoax paper exposes gender studies as a fraud contains a demonstrable fatal error; that is, it gets a crucial fact wrong, thus rendering their argument unsound.

The point is this: One would expect skeptics, of all people, who claim to be responsive to the evidence, to acknowledge this factual error. Yet not a single leader of the new atheist movement has publicly mentioned the factual problems with the hoax. Had someone (or preferably all of them) done this, it would have affirmed the new atheist commitment to intellectual honesty, to putting truth before pride and epistemology before ideology, thereby restoring its damaged credibility.

Even worse, Boghossian and Lindsay explicitly argue, in response to some critics, that they don't need to know the field of gender studies to criticize it. This is, properly contextualized, about as anti-intellectual as one can get. Sure, it is a fallacy to immediately dismiss someone's criticisms of a topic simply because that person doesn't have a degree in the topic. Doing this is called the Courtier's Reply. But it decidedly isn't a fallacy to criticize someone for being incredibly ignorant, and even ignorant of their own ignorance, regarding an issue they're making strong, confident-sounding claims about. Kids, listen to me: Knowledge is a good thing, despite what Boghossian and Lindsay suggest, and you should always work hard to understand a position before you level harsh criticisms at it. Otherwise you'll end up looking like a fool to those in the know.

Along these lines, the new atheist movement has flirted with misogyny for years. Harris' "estrogen vibe" statement, which yielded a defense rather than a gracious apology, was only the tip of the iceberg. As mentioned above, there have been numerous allegations of sexual assault, and atheist conferences have pretty consistently been male-dominated, resulting in something like a gender Matthew effect.

Many leading figures have recently allied themselves with small-time television personality Dave Rubin, a guy who has repeatedly given Milo Yiannopoulos (the professional right-wing troll who once said that little boys would stop complaining about being raped by Catholic priests if the priests were as good-looking as he is) a platform on his show. In a tweet from last May, Rubin said "I'd like a signed copy, please" in response to a picture that reads: "Ah. Peace and quiet. #ADayWithoutAWoman." If, say, Paul Ryan were asked, he'd describe this as sort of like the textbook definition of a misogynistic comment. Did any new atheist leaders complain about this tweet? Of course not, much to the frustration of critical thinkers like myself who actually care about how women are treated in society.

In fact, the magazine Skeptic just published a glowing review of Yiannopoulos' recent book, Dangerous. The great irony of this intellectual misstep is that Yiannopoulos embodies the opposite of nearly every trend of moral progress that Michael Shermer, the editor of Skeptic, identifies in his book The Moral Arc.

Yiannopoulos is a radical anti-intellectual, often ignoring facts or simply lying about issues; he uses hyperbolic rhetoric (e.g., "feminism is cancer") that stymies rather than promotes rational discussion; he holds some outright racist views; he professes nonsensical views, such as the idea that birth control makes women unattractive and crazy; he uses hate speech, which indicates that he's not a very nice person; he once publicly called out a transgender student by name during a talk; and he supports Donald Trump, who has essentially led a society-wide campaign against the Enlightenment. Oh, and need I mention that Yiannopoulos once said that if it weren't for his own experience of abuse by a Catholic priest, he never would have learned to give such good head? The merger between the alt-right and the new atheist movement continues to solidify.

Perhaps the most alarming instance of irrationality in recent memory, though, is Sam Harris' recent claim that black people are less intelligent than white people. This emerged from a conversation that Harris had with Charles Murray, co-author of The Bell Curve and a monetary recipient of the racist Pioneer Fund. There are two issues worth dwelling upon here. The first is scientific: Despite what Harris asserts, science does not support the conclusion that there are gene-based IQ differences between the races. To confirm this, I emailed the leading psychologist Howard Gardner, who told me that "The racial difference speculations of Herrnstein and Murray remain very controversial," as well as James Flynn (world-renowned for the Flynn effect), who responded that, "Taking into account the range of evidence, I believe that black and white Americans are not distinguished by genes for IQ. However, the debate is ongoing."

The point is simply this: Scottish philosopher David Hume famously declared that the wise person always proportions her beliefs to the evidence. It follows that when a community of experts is divided on an issue, it behooves the rational non-expert to hold her opinion in abeyance. In direct opposition to this epistemic principle, Harris takes a firm stand on race and intelligence, even receiving adulation for doing this from other white men in the new atheist community. A more thoughtful public intellectual would have said: "Look, this is a very complicated issue that leading psychologists disagree about. A minority say there is a genetically based correlation between race and IQ while many others claim just the opposite, with perhaps the largest group holding that we simply don't know enough right now. Since I am rational, I too will say that we simply don't know."

The second issue is ethical: Is it right, wise or justified to publicly declare that one race is genetically inferior to another, given the immense societal consequences this could have? Not only could this claim empower white supremacists (individuals who wouldn't be sympathetic to Harris' follow-up claim that generalizations about a race of people don't warrant discriminating against individual members of that race), but science tells us that such information can have direct and appreciable negative consequences for members of the targeted race. For example, stereotype threat describes how the mere mention that one's racial class is inferior can have measurable detrimental effects on one's cognitive performance. Similarly, teacher expectancy effects describe how, if teachers are told that some students are smart and others are dumb, with those labels randomly assigned, the "smart" students will statistically do better in class than the "dumb" ones.

To broadcast a scientifically questionable meme that could have seriously bad effects for people already struggling in a society that was founded upon racism, and is still struggling to overcome it, is, I would argue, the height of intellectual irresponsibility.

Although the new atheist movement once filled me with a great sense of optimism about the future of humanity, this is no longer the case. Movements always rise and fall (they have a life cycle, of sorts), but the fall of this movement has been especially poignant for me. The new atheists of today would rather complain about trigger warnings in classrooms than eliminate rape on campuses. They'd rather whine about safe spaces than help transgender people feel accepted by society. They loudly claim to support free speech and yet routinely ban dissenters from social media, blogs and websites.

They say they care about facts, yet refuse to change their beliefs when inconvenient data are presented. They decry people who make strong assertions outside of their field and yet feel perfectly entitled to make fist-poundingly confident claims about issues they know little about. And they apparently don't give a damn about alienating women and people of color, a truly huge demographic of potential allies in the battle against religious absurdity.

On a personal note, a recent experience further cemented my view that the new atheists are guilty of false advertising. A podcaster named Lalo Dagach saw that I had criticized Harris' understanding of Islamic terrorism, which I believe lacks scholarly rigor. In response, he introduced me to his Twitter audience of 31,000 people as follows: "Phil Torres (@xriskology) everyone. Mourns the loss of ISIS and celebrates attacks on atheists." Below this tweet was a screenshot of the last two articles I had written for Salon: one about the importance of listening to the experts on terrorism, and the other about how the apocalyptic ideology of the Islamic extremists of ISIS is more likely to evolve into new forms than go extinct.

First of all, Dagach's tweet was overtly defamatory. I wrote him asking for a public apology and heard nothing back, although he quietly deleted the tweet. But even that did not happen until I had received a hailstorm of disturbing responses to Dagach's false statements, responses in the form of internet trolls aggressively defending Harris by asking me to kill myself and proposing new nicknames like "Phil Hitler Torres" (seriously!). This is the new atheist movement today, by and large. The great enemy of critical thinking and epistemological integrity, namely tribalism, has become the social glue of the community.

I should still be the new atheist movement's greatest ally, yet today I want nothing whatsoever to do with it. From censoring people online while claiming to support free speech, to endorsing scientifically unfounded claims about race and intelligence, to asserting, as Harris once did, that the profoundly ignorant Ben Carson would make a better president than the profoundly knowledgeable Noam Chomsky, the movement has repeatedly shown itself to lack precisely the values it once avowed to uphold. Words that now come to mind when I think of new atheism are un-nuanced, heavy-handed, unjustifiably confident and resistant to evidence, not to mention, on the whole, misogynist and racist.

And while there are real and immensely important issues to focus on in the world, such as climate change, nuclear proliferation, food production, ocean acidification, the sixth mass extinction and so on, even the most cursory glance at any leading new atheist's social-media feed reveals a bizarre obsession with what they call the "regressive left." This is heartbreaking, because humanity needs thoughtful, careful, nuanced, scientifically minded thinkers more now than ever before.

See original here:
From the Enlightenment to the Dark Ages: How new atheism slid ... - Salon

Written by admin |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

Enlightenment, Rogue-like Game Set To Hit Steam On August 4th … – One Angry Gamer (blog)

Posted: at 1:41 am


(Last Updated On: July 30, 2017)

Fans of rogue-like games have yet another title to look forward to that will be hitting Steam Early Access very soon: Enlightenment, from Coconut Island Games and LizardKing. The game is set to hit PC via Steam come August 4th.

Enlightenment is an action-shooter with heavy doses of rogue-like features nestled into its core. The narrative is said to be nonlinear and tasks players with an adventure into a wasteland that flaunts a mysterious dungeon known as The Ark. Enlightenment invites the curious and fans of the rogue-like genre to indulge in a risky journey plagued by a monstrous crisis.

As for the story of Enlightenment, the content explaining said rogue-like game lies below for you to read over:

An asteroid wiped out civilization as we know it. Some wasteland tramp discovered that the asteroid shards grant possessors unexplained powers; so they founded this cult, calling it the Scientific Church of Enlightenment and this Church of Enlightenment built the Ark and they built a whole city around it. It's gonna be where the restoration of humanity starts, they said. But just look around you; these streets are all empty, not a soul to be seen at all now.

The underground settlement often referred to as The Ark is a dim complex that houses the dead bodies and debased minds of those who dared to enter unprepared. The game tests players' skill to see whether they become yet another addition to the complex or a hero in the making.

In Enlightenment, you play a roguelike action-shooter that offers a fast-paced, challenging journey deep into an underground complex called the Ark. Players will be tested by a large variety of enemies in a procedurally generated dungeon, and become stronger by learning from their inevitable deaths.

The developers not too long ago posted up the latest Enlightenment trailer along with its Steam Early Access page; the former is up for your viewing pleasure.

As noted above, Enlightenment is set to drop on August 4th for PC via Steam Early Access. Additional information on this game can be found by checking out its main site.


Link:
Enlightenment, Rogue-like Game Set To Hit Steam On August 4th ... - One Angry Gamer (blog)

Written by admin |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

Personal computer – Wikipedia

Posted: July 30, 2017 at 2:32 pm


A personal computer (PC) is a multi-purpose electronic computer whose size, capabilities, and price make it feasible for individual use. PCs are intended to be operated directly by an end user, rather than by a computer expert or technician.

"Computers were invented to 'compute': to solve complex mathematical problems, but today, due to media dependency and the everyday use of computers, it is seen that 'computing' is the least important thing computers do."[1] The computer time-sharing models that were typically used with larger, more expensive minicomputer and mainframe systems, to enable them be used by many people at the same time, are not used with PCs.

Early computer owners in the 1960s, invariably institutional or corporate, had to write their own programs to do any useful work with the machines. In the 2010s, personal computer users have access to a wide range of commercial software, free software ("freeware") and free and open-source software, which are provided in ready-to-run form. Software for personal computers is typically developed and distributed independently from the hardware or OS manufacturers.[2] Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with systems such as smartphones or tablet computers, where software is often only available through a manufacturer-supported channel, and end-user program development may be discouraged by lack of support by the manufacturer.
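As an illustration of end-user programming in this sense, here is a minimal sketch (not part of the original article) of the kind of short script a non-specialist might write with a ready-to-run interpreter such as Python; the expense figures are hypothetical placeholders.

# A minimal end-user program: total up some personal monthly costs.
# The figures are hypothetical placeholders, not real data.
monthly_costs = {"rent": 900, "groceries": 320, "utilities": 110, "internet": 45}

total = sum(monthly_costs.values())
largest = max(monthly_costs, key=monthly_costs.get)

print(f"Total monthly spending: ${total}")
print(f"Largest expense: {largest} (${monthly_costs[largest]})")

A script of this size needs nothing beyond the interpreter itself, which is the point about end-user programming remaining feasible even though most users never write programs.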

Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and then with Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry. These include Apple's macOS and free open-source Unix-like operating systems such as Linux. Advanced Micro Devices (AMD) provides the main alternative to Intel's processors.

"PC" is an initialism for "personal computer". The IBM Personal Computer incorporated the designation in its model name, but IBM has not used this brand for many years. It is sometimes useful, especially in a marketing context, to distinguish personal computers of the "IBM Personal Computer" family from personal computers made by other manufacturers. For example, "PC" is used in contrast with "Mac", an Apple Macintosh computer.[3][4][5][6] This sense of the word is used in the Get a Mac advertisement campaign that ran between 2006 and 2009, as well as its rival, I'm a PC campaign, that appeared in 2008. Since none of these Apple products were mainframes or time-sharing systems, they were all "personal computers" and not "PC" (brand) computers.

The brain [computer] may one day come down to our level [of the common people] and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far.

In the history of computing there were many examples of computers designed to be used by one person, as opposed to terminals connected to mainframe computers. It took a while for computers to be developed that meet the modern definition of a "personal computer": one that is designed for one person, is easy to use, and is cheap enough for an individual to buy.[8]

Using the narrow definition of "operated by one person", the first personal computer was the ENIAC which became operational in 1946.[9] It did not meet further definitions of affordable or easy to use.

An example of an early single-user computer was the LGP-30, created in 1956 by Stan Frankel and used for science and engineering as well as basic data processing.[10] It came with a retail price of $47,000, equivalent to about $414,000 today.[11]

Introduced at the 1965 New York World's Fair, the Programma 101 was a printing programmable calculator[12][13] described in advertisements as a "desktop computer".[14][15][16][17] It was manufactured by the Italian company Olivetti and invented by the Italian engineer Pier Giorgio Perotto, inventor of the magnetic card system for program storage.[citation needed]

The Soviet MIR series of computers was developed from 1965 to 1969 in a group headed by Victor Glushkov. It was designed as a relatively small-scale computer for use in engineering and scientific applications and contained a hardware implementation of a high-level programming language. Another innovative feature for that time was the user interface combining a keyboard with a monitor and light pen for correcting texts and drawing on screen.[18] In what was later to be called the Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of what would become the staples of daily working life in the 21st century: e-mail, hypertext, word processing, video conferencing and the mouse. The demonstration required technical support staff and a mainframe time-sharing computer that were far too costly for individual business use at the time.

By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person. Early personal computers, generally called microcomputers, were often sold in kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done with toggle switches to enter instructions, and output was provided by front panel lamps. Practical use required adding peripherals such as keyboards, computer displays, disk drives, and printers. The Micral N was the earliest commercial, non-kit microcomputer based on a microprocessor, the Intel 8008. It was built starting in 1972 and about 90,000 units were sold. It had been preceded by the Datapoint 2200 in 1970, for which the Intel 8008 had been commissioned, though not accepted for use. The CPU design implemented in the Datapoint 2200 became the basis for the x86 architecture used in the original IBM PC and its descendants.[19]

In 1973 the IBM Los Gatos Scientific Center developed a portable computer prototype called SCAMP (Special Computer APL Machine Portable) based on the IBM PALM processor with a Philips compact cassette drive, small CRT and full-function keyboard. SCAMP emulated an IBM 1130 minicomputer in order to run APL1130.[20] In 1973 APL was generally available only on mainframe computers, and most desktop-sized microcomputers such as the Wang 2200 or HP 9800 offered only BASIC. Because SCAMP was the first to emulate APL1130 performance on a portable, single-user computer, PC Magazine in 1983 designated SCAMP a "revolutionary concept" and "the world's first personal computer".[20][21] This seminal, single-user portable computer now resides in the Smithsonian Institution, Washington, D.C. Successful demonstrations of the 1973 SCAMP prototype led to the IBM 5100 portable microcomputer launched in 1975 with the ability to be programmed in both APL and BASIC for engineers, analysts, statisticians and other business problem-solvers. In the late 1960s such a machine would have been nearly as large as two desks and would have weighed about half a ton.[20]

A seminal step in personal computing was the 1973 Xerox Alto, developed at Xerox's Palo Alto Research Center (PARC). It had a graphical user interface (GUI) which later served as inspiration for Apple Computer's Macintosh, and Microsoft's Windows operating system. The Alto was a demonstration project, not commercialized, as the parts were too expensive to be affordable.[8]

Also in 1973 Hewlett Packard introduced fully BASIC programmable microcomputers that fit entirely on top of a desk, including a keyboard, a small one-line display and printer. The Wang 2200 microcomputer of 1973 had a full-size cathode ray tube (CRT) and cassette tape storage.[22] These were generally expensive specialized computers sold for business or scientific uses. The introduction of the microprocessor, a single chip with all the circuitry that formerly occupied large cabinets, led to the proliferation of personal computers after 1975.

1974 saw the introduction of what is considered by many to be the first true "personal computer", the Altair 8800 created by Micro Instrumentation and Telemetry Systems (MITS).[23][24] Based on the 8-bit Intel 8080 Microprocessor,[25] the Altair is widely recognized as the spark that ignited the microcomputer revolution[26] as the first commercially successful personal computer.[27] The computer bus designed for the Altair was to become a de facto standard in the form of the S-100 bus, and the first programming language for the machine was Microsoft's founding product, Altair BASIC.[28][29]

In 1976, Steve Jobs and Steve Wozniak sold the Apple I computer circuit board, which was fully prepared and contained about 30 chips. The Apple I differed from the other kit-style hobby computers of the era. At the request of Paul Terrell, owner of the Byte Shop, Steve Jobs was given his first purchase order, for 50 Apple I computers, but only if the computers were assembled and tested rather than supplied as kits. Terrell wanted to have computers to sell to a wide range of users, not just experienced electronics hobbyists who had the soldering skills to assemble a computer kit. Even so, the Apple I as delivered to the Byte Shop was still technically a kit computer, since it did not include a power supply, case, or keyboard.

The first successfully mass-marketed personal computer was the Commodore PET, introduced in January 1977. However, it was back-ordered and not available until later in the year.[30] Five months later (June), the Apple II (usually referred to as the "Apple") was introduced,[31] followed by the TRS-80 from Tandy Corporation / Tandy Radio Shack in summer 1977, delivered in small numbers starting in September. Mass-market ready-assembled computers allowed a wider range of people to use computers, focusing more on software applications and less on development of the processor hardware.

During the early 1980s, home computers were further developed for household use, with software for personal productivity, programming and games. They typically could be used with a television already in the home as the computer display, with low-detail blocky graphics, a limited color range, and text about 40 characters wide by 25 characters tall. Sinclair Research,[32] a UK company, produced the ZX Series: the ZX80 (1980), the ZX81 (1981), and the ZX Spectrum; the latter was introduced in 1982 and totaled 8 million units sold. It was followed by the Commodore 64, which totaled 17 million units sold.[33][34]

In the same year, the NEC PC-98 was introduced; this very popular personal computer went on to sell more than 18 million units.[35] Another famous personal computer, the revolutionary Amiga 1000, was unveiled by Commodore on July 23, 1985. The Amiga 1000 featured a multitasking, windowing operating system, color graphics with a 4096-color palette, stereo sound, a Motorola 68000 CPU, 256 KB of RAM, and an 880 KB 3.5-inch disk drive, for US$1,295.[36]

Somewhat larger and more expensive systems (for example, running CP/M), or sometimes a home computer with additional interfaces and devices, although still low-cost compared with minicomputers and mainframes, were aimed at office and small business use, typically using "high resolution" monitors capable of at least 80-column text display, and often with no graphical or color drawing capability. Workstations were characterized by high-performance processors and graphics displays, with large-capacity local disk storage, networking capability, and running under a multitasking operating system. Eventually, due to the influence of the IBM PC on the personal computer market, personal computers and home computers lost any technical distinction. Business computers acquired color graphics capability and sound, and home computers and game systems users used the same processors and operating systems as office workers. Mass-market computers had graphics capabilities and memory comparable to dedicated workstations of a few years before. Even local area networking, originally a way to allow business computers to share expensive mass storage and peripherals, became a standard feature of personal computers used at home.

In 1982 "The Computer" was named Machine of the Year by Time magazine. In the 2010s, several companies such as Hewlett-Packard and Sony sold off their PC and laptop divisions. As a result, the personal computer was declared dead several times during this period.[37]

A workstation is a high-end personal computer designed for technical, mathematical, or scientific applications. Intended primarily to be used by one person at a time, they are commonly connected to a local area network and run multi-user operating systems. Workstations are used for tasks such as computer-aided design, drafting and modeling, computation-intensive scientific and engineering calculations, image processing, architectural modeling, and computer graphics for animation and motion picture visual effects.[38]

Prior to the widespread usage of PCs, a computer that could fit on a desk was remarkably small, leading to the "desktop" nomenclature. More recently, the phrase usually indicates a particular style of computer case. Desktop computers come in a variety of styles ranging from large vertical tower cases to small models which can be tucked behind an LCD monitor. In this sense, the term "desktop" refers specifically to a horizontally oriented case, usually intended to have the display screen placed on top to save desk space. Most modern desktop computers have an external display screen and an external keyboard, which are typically plugged into the computer case.

A gaming computer is a standard desktop computer that typically has high-performance hardware, such as a more powerful video card, processor and memory, in order to handle the requirements of demanding video games, which are often simply called "PC games".[39] A number of companies, such as Alienware, manufacture prebuilt gaming computers, and companies such as Razer and Logitech market mice, keyboards and headsets geared toward gamers.

Single-unit PCs (also known as all-in-one PCs) are a subtype of desktop computers that combine the monitor and case of the computer within a single unit. The monitor often utilizes a touchscreen as an optional method of user input, but separate keyboards and mice are normally still included. The inner components of the PC are often located directly behind the monitor, and many such PCs are built similarly to laptops.

A subtype of desktops, called nettops, was introduced by Intel in February 2008, characterized by low cost and lean functionality. A similar subtype of laptops (or notebooks) is the netbook, described below. The product line features the new Intel Atom processor, which specifically enables nettops to consume less power and fit into small enclosures.

A home theater PC (HTPC) is a convergence device that combines the functions of a personal computer and a digital video recorder. It is connected to a TV set or an appropriately sized computer display, and is often used as a digital photo viewer, music and video player, TV receiver, and digital video recorder. HTPCs are also referred to as media center systems or media servers. The general goal in an HTPC is usually to combine many or all components of a home theater setup into one box. More recently, HTPCs gained the ability to connect to services providing on-demand movies and TV shows. HTPCs can be purchased pre-configured with the required hardware and software needed to add television programming to the PC, or can be cobbled together out of discrete components, as is commonly done with software support from MythTV, Windows Media Center, GB-PVR, SageTV, Famulent or LinuxMCE.

A laptop computer, also called a notebook, is a small personal computer designed for portability. Usually, all of the hardware and interfaces needed to operate a laptop, such as the graphics card, audio devices or USB ports (previously parallel and serial ports), are built into a single unit. Laptops usually have a "clamshell" design, in which the keyboard and computer components are on one panel and a flat display screen on a second panel, which is hinged to the first panel. The laptop is opened for use and closed for transport. Closing the laptop also protects the screen and keyboard during transportation. Laptops have both a power cable that can be plugged in and high-capacity batteries that can power the device, enhancing its portability. Once the battery charge is depleted, it will have to be recharged through a power outlet. In the interests of saving power, weight and space, laptop graphics cards are in many cases integrated into the CPU or chipset and use system RAM, resulting in reduced graphics performance when compared to an equivalent desktop machine. For this reason, desktop or gaming computers are usually preferred to laptop PCs for gaming purposes.

One of the drawbacks of laptops is that, due to the size and configuration of components, usually relatively little can be done to upgrade the overall computer from its original design or add components. Internal upgrades are either not manufacturer-recommended, can damage the laptop if done with poor care or knowledge, or are in some cases impossible, making the desktop PC more modular and upgradable. Desktop PCs typically have a case that has extra empty space inside, where users can install new components. Some internal upgrades to laptops, such as memory and hard disk drive upgrades, are often easily performed, while a display or keyboard upgrade is usually difficult or impossible. Just like desktops, laptops also have the same input and output ports for connecting to a wide variety of devices, including external displays, mice, cameras, storage devices and keyboards, which may be attached externally through USB ports and other less common ports such as external video. Laptops are also a little more expensive compared to desktops, as the miniaturized components for laptops themselves are expensive.

A subtype of notebooks, called subnotebook, has most of the features of a standard laptop computer, but with smaller physical dimensions. Subnotebooks are larger than hand-held computers, and usually run full versions of desktop or laptop operating systems. Ultra-Mobile PCs (UMPC) are usually considered subnotebooks, or more specifically, subnotebook tablet PCs, which are described below. Netbooks are sometimes considered to belong to this category, though they are sometimes separated into a category of their own (see below).

A desktop replacement computer (DTR) is a personal computer that provides the full capabilities of a desktop computer while remaining mobile. Such computers are often actually larger, bulkier laptops. Because of their increased size, this class of computers usually includes more powerful components and a larger display than generally found in smaller portable computers, and can have a relatively limited battery capacity or none at all in some cases. Some use a limited range of desktop components to provide better performance at the expense of battery life. Desktop replacement computers are sometimes called desknotes, as a portmanteau of words "desktop" and "notebook", though the term is also applied to desktop replacement computers in general.[40]

Netbooks, also called mini notebooks or subnotebooks, are a subgroup of laptops[41] acting as a category of small, lightweight and inexpensive laptop computers suited for general computing tasks and accessing web-based applications. They are often marketed as "companion devices", with an intention to augment other ways in which a user can access computer resources.[41] Walt Mossberg called them a "relatively new category of small, light, minimalist and cheap laptops."[42] By August 2009, CNET called netbooks "nothing more than smaller, cheaper notebooks."[41] Initially, the primary defining characteristic of netbooks was the lack of an optical disc drive, requiring it to be a separate external device. This has become less important as flash memory devices have gradually increased in capacity, replacing the writable optical disc (e.g. CD-RW, DVD-RW) as a transportable storage medium.

At their inception in late 2007, as smaller notebooks optimized for low weight and low cost,[43] netbooks omitted key features (e.g., the optical drive), featured smaller screens and keyboards, and offered reduced specifications and computing power. Over the course of their evolution, netbooks have ranged in screen size from below five inches[44] to over 13 inches,[45] with weights around 1 kg (2 to 3 pounds). Often significantly less expensive than other laptops,[46] by mid-2009 netbooks had been offered to users "free of charge", with an extended service contract purchase of a cellular data plan.[47] In the short period since their appearance, netbooks have grown in size and features, converging with new smaller and lighter notebooks. By mid-2009, CNET noted that "the specs are so similar that the average shopper would likely be confused as to why one is better than the other," noting "the only conclusion is that there really is no distinction between the devices."[41]

A tablet is a type of portable PC that de-emphasizes the use of traditional input devices (such as a mouse or keyboard) by using a touchscreen display, which can be controlled using either a stylus pen or finger. Some tablets may use a "hybrid" or "convertible" design, offering a keyboard that can either be removed as an attachment, or a screen that can be rotated and folded directly over top of the keyboard. Some tablets may run a traditional PC operating system such as Windows or Linux; Microsoft attempted to enter the tablet market in 2002 with its Microsoft Tablet PC specifications, for tablets and convertible laptops running Windows XP. However, Microsoft's early attempts were overshadowed by the release of Apple's iPad; following in its footsteps, most modern tablets use slate designs and run mobile operating systems such as Android and iOS, giving them functionality similar to smartphones. In response, Microsoft built its Windows 8 operating system to better accommodate these new touch-oriented devices.[48] Many tablet computers have USB ports, to which a keyboard or mouse can be connected.

The ultra-mobile PC (UMPC) is a specification for small-configuration tablet PCs. It was developed as a joint development exercise by Microsoft, Intel and Samsung, among others. Current UMPCs typically feature the Windows XP, Windows Vista, Windows 7, or Linux operating system, and low-voltage Intel Atom or VIA C7-M processors.

A pocket PC is a hardware specification for a handheld-sized computer (personal digital assistant, PDA) that runs the Microsoft Windows Mobile operating system. It may have the capability to run an alternative operating system like NetBSD or Linux. Pocket PCs have many of the capabilities of modern desktop PCs. Numerous applications are available for handhelds adhering to the Microsoft Pocket PC specification, many of which are freeware. Some of these devices also include mobile phone features, actually representing a smartphone. Microsoft-compliant Pocket PCs can also be used with many other add-ons like GPS receivers, barcode readers, RFID readers and cameras. In 2007, with the release of Windows Mobile 6, Microsoft dropped the name Pocket PC in favor of a new naming scheme: devices without an integrated phone are called Windows Mobile Classic instead of Pocket PC, while devices with an integrated phone and a touch screen are called Windows Mobile Professional.[49]

Computer hardware is a comprehensive term for all physical parts of a computer, as distinguished from the data it contains or operates on, and the software that provides instructions for the hardware to accomplish tasks. The boundary between hardware and software has become blurred, with the existence of firmware that is software "built into" the hardware. For example, a 2010-era LCD display screen contains a small computer inside. Mass-market consumer computers use highly standardized components and so are simple for an end user to assemble into a working system. Most 2010s-era computers only require users to plug in the power supply, monitor, and other cables. A typical desktop computer consists of a computer case (or "tower"), a metal chassis that holds the power supply, motherboard, hard disk drive, and often an optical disc drive. Most towers have empty space where users can add additional components. External devices such as a computer monitor or visual display unit, keyboard, and a pointing device (mouse) are usually found in a personal computer.

The motherboard connects all processor, memory and peripheral devices together. The RAM, graphics card and processor are in most cases mounted directly onto the motherboard. The central processing unit (microprocessor chip) plugs into a CPU socket, while the memory modules plug into corresponding memory sockets. Some motherboards have the video display adapter, sound and other peripherals integrated onto the motherboard, while others use expansion slots for graphics cards, network cards, or other I/O devices. The graphics card or sound card may employ a break out box to keep the analog parts away from the electromagnetic radiation inside the computer case. Disk drives, which provide mass storage, are connected to the motherboard with one cable, and to the power supply through another cable. Usually, disk drives are mounted in the same case as the motherboard; expansion chassis are also made for additional disk storage.

For large amounts of data, a tape drive can be used or extra hard disks can be put together in an external case. The keyboard and the mouse are external devices plugged into the computer through connectors on an I/O panel on the back of the computer case. The monitor is also connected to the input/output (I/O) panel, either through an onboard port on the motherboard, or a port on the graphics card. Capabilities of the personal computer's hardware can sometimes be extended by the addition of expansion cards connected via an expansion bus. Standard peripheral buses often used for adding expansion cards in personal computers include PCI, PCI Express (PCIe), and AGP (a high-speed PCI bus dedicated to graphics adapters, found in older computers). Most modern personal computers have multiple physical PCI Express expansion slots, with some of them having PCI slots as well.
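To make the expansion-bus idea concrete, the sketch below (an added example, not from the original article) reads the Linux kernel's sysfs entries for devices attached to the PCI/PCIe buses; it assumes a Linux system with sysfs mounted at /sys and prints roughly the information that a tool such as lspci summarizes.

# A minimal sketch: list PCI/PCIe devices via Linux sysfs (assumes /sys exists).
from pathlib import Path

PCI_DEVICES = Path("/sys/bus/pci/devices")

def list_pci_devices():
    devices = []
    for dev in sorted(PCI_DEVICES.iterdir()):
        vendor = (dev / "vendor").read_text().strip()     # vendor id, hex, e.g. "0x8086"
        device = (dev / "device").read_text().strip()     # device id, hex
        pci_class = (dev / "class").read_text().strip()   # class code, e.g. "0x030000" for a display controller
        devices.append((dev.name, vendor, device, pci_class))
    return devices

if __name__ == "__main__":
    for address, vendor, device, pci_class in list_pci_devices():
        print(f"{address}  vendor={vendor}  device={device}  class={pci_class}")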

A computer case is an enclosure that contains the main components of a computer. They are usually constructed from steel or aluminum combined with plastic, although other materials such as wood have been used for specialized units. Cases are available in different sizes and shapes; the size and shape of a computer case is usually determined by the configuration of the motherboard that it is designed to accommodate, since this is the largest and most central component of most computers. The most popular style for desktop computers is ATX, although microATX and similar layouts became very popular for a variety of uses. Companies like Shuttle Inc. and AOpen have popularized small cases, for which FlexATX is the most common motherboard size. In the 1990s, desktop computer cases were larger and taller than 2010-era computer cases.

The power supply unit (PSU) converts general-purpose mains AC electricity to direct current (DC) for the other components of the computer. The rated output capacity of a PSU should usually be about 40% greater than the calculated system power consumption needs obtained by adding up all the system components. This protects against overloading the supply, and guards against performance degradation.
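A small worked example may help with this sizing rule. The sketch below simply sums hypothetical component wattages (placeholders, not measurements) and applies the roughly 40% headroom described above.

# Sum estimated component draws and add ~40% headroom (all figures hypothetical).
component_draw_watts = {
    "cpu": 95,
    "graphics_card": 180,
    "motherboard_and_ram": 60,
    "drives_and_fans": 40,
}

HEADROOM = 0.40  # "about 40% greater" than the calculated consumption

calculated_load = sum(component_draw_watts.values())    # 375 W with these placeholders
recommended_psu = calculated_load * (1 + HEADROOM)       # ~525 W

print(f"Calculated load:  {calculated_load} W")
print(f"Recommended PSU: ~{recommended_psu:.0f} W (round up to a standard rating)")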

The central processing unit, or CPU, is a part of a computer that executes instructions of a software program. In newer PCs, the CPU contains over a million transistors in one integrated circuit chip called the microprocessor. In most cases, the processor plugs directly into the motherboard. The chip generates so much heat that the PC builder is required to attach a special cooling device to its surface; thus, modern CPUs are equipped with a fan attached via heat sink. IBM PC compatible computers use an x86-compatible microprocessor, manufactured by Intel, AMD, VIA Technologies or Transmeta. Apple Macintosh computers were initially built with the Motorola 680x0 family of processors, then switched to the PowerPC series; in 2006, they switched to x86-compatible processors made by Intel.

The motherboard, also referred to as system board or main board, is the primary circuit board within a personal computer, and other major system components plug directly into it or via a cable. A motherboard contains a microprocessor, the CPU supporting circuitry (mostly integrated circuits) that provide the interface between memory and input/output peripheral circuits, main memory, and facilities for initial setup of the computer immediately after power-on (often called boot firmware or, in IBM PC compatible computers, a BIOS or UEFI). In many portable and embedded personal computers, the motherboard houses nearly all of the PC's core components. Often a motherboard will also contain one or more peripheral buses and physical connectors for expansion purposes. Sometimes a secondary daughter board is connected to the motherboard to provide further expandability or to satisfy space constraints.

A PC's main memory is a fast primary storage device that is directly accessible by the CPU, and is used to store the currently executing program and immediately needed data. PCs use semiconductor random-access memory (RAM) of various kinds such as DRAM, SDRAM or SRAM as their primary storage. Which exact kind is used depends on cost/performance issues at any particular time. Main memory is much faster than mass storage devices like hard disk drives or optical discs, but is usually volatile, meaning that it does not retain its contents (instructions or data) in the absence of power, and is much more expensive for a given capacity than is most mass storage. As a result, main memory is generally not suitable for long-term or archival data storage.

Mass storage devices store programs and data even when the power is off; they do require power to perform read and write functions during usage. Although flash memory has dropped in cost, the prevailing form of mass storage in personal computers is still the hard disk drive. If the mass storage controller provides additional ports for expandability, a PC may also be upgraded by the addition of extra hard disk or optical disc drives. For example, BD-ROMs, DVD-RWs, and various optical disc recorders may all be added by the user to certain PCs. Standard internal storage device connection interfaces are PATA, Serial ATA and SCSI. Solid state drives (SSDs) are a much faster replacement for traditional mechanical hard disk drives, but are also more expensive in terms of cost per gigabyte.
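The cost-per-gigabyte comparison can be made concrete with a short calculation; the prices and capacities below are hypothetical placeholders rather than market data.

# Cost per gigabyte for a hypothetical HDD and SSD.
def cost_per_gb(price_usd, capacity_gb):
    return price_usd / capacity_gb

hdd = cost_per_gb(price_usd=50, capacity_gb=1000)   # hypothetical 1 TB hard disk drive
ssd = cost_per_gb(price_usd=100, capacity_gb=250)   # hypothetical 250 GB solid-state drive

print(f"HDD: ${hdd:.3f}/GB   SSD: ${ssd:.3f}/GB   SSD/HDD ratio: {ssd / hdd:.1f}x")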

A visual display unit, computer monitor or just display, is a piece of electrical equipment, usually separate from the computer case, which displays visual images without producing a permanent computer record. In the 1980s a display device was usually a CRT, but by the 2000s, flat panel displays such as TFT LCDs had largely replaced the bulkier, heavier CRT screens. Multi-monitor setups are quite common in the 2010s, as they enable a user to display multiple programs at the same time (e.g., an email inbox and a word processing program). The display unit houses the electronic circuitry that generates its picture from signals received from the computer. Within the computer, either integral to the motherboard or plugged into it as an expansion card, there is pre-processing circuitry to convert the microprocessor's output data to a format compatible with the display unit's circuitry. The images from computer monitors originally contained only text, but as graphical user interfaces emerged and became common, they began to display more images and multimedia content. The term "monitor" is also used, particularly by technicians in broadcasting television, where a picture of the broadcast data is displayed to a highly standardized reference monitor for confidence checking purposes.

The video cardotherwise called a graphics card, graphics adapter or video adapterprocesses the graphics output from the motherboard and transmits it to the display. It is an essential part of modern multimedia-enriched computing. On older models, and today on budget models, graphics circuitry may be integrated with the motherboard, but for modern and flexible machines, they are connected by the PCI, AGP, or PCI Express interface. When the IBM PC was introduced, most existing business-oriented personal computers used text-only display adapters and had no graphics capability. Home computers at that time had graphics compatible with television signals, but with low resolution by modern standards owing to the limited memory available to the eight-bit processors available at the time.

A keyboard is an arrangement of buttons that each correspond to a function, letter, or number. They are the primary devices used for inputting text. In most cases, they contain an array of keys specifically organized with the corresponding letters, numbers, and functions printed or engraved on the button. They are generally designed around an operator's language, and many different versions for different languages exist. In English, the most common layout is the QWERTY layout, which was originally used in typewriters. They have evolved over time, and have been modified for use in computers with the addition of function keys, number keys, arrow keys, and keys specific to an operating system. Often, specific functions can be achieved by pressing multiple keys at once or in succession, such as inputting characters with accents or opening a task manager. Programs use keyboard shortcuts very differently, assigning different shortcuts to program-specific operations such as refreshing a web page in a web browser or selecting all text in a word processor. In addition to the alphabetic keys found on a typewriter, computer keyboards typically have a numeric keypad and a row of function keys and special keys, such as Ctrl, Alt, Del and Esc.

A computer mouse is a small handheld device that users hold and slide across a flat surface, pointing at various elements of a graphical user interface with an on-screen cursor, and selecting and moving objects using the mouse buttons. Almost all modern personal computers include a mouse; it may be plugged into a computer's rear mouse socket, or as a USB device, or, more recently, may be connected wirelessly via a USB dongle or Bluetooth link. In the past, mice had a single button that users could press down on the device to "click" on whatever the pointer on the screen was hovering over. Modern mice have two, three or more buttons, providing a "right click" function button on the mouse, which performs a secondary action on a selected object, and a scroll wheel, which users can rotate using their fingers to "scroll" up or down. The scroll wheel can also be pressed down, and therefore be used as a third button. Some mouse wheels may be tilted from side to side to allow sideways scrolling. Different programs make use of these functions differently, and may scroll horizontally by default with the scroll wheel, open different menus with different buttons, etc. These functions may also be user-defined through software utilities. Mice traditionally detected movement and communicated with the computer with an internal "mouse ball", and used optical encoders to detect rotation of the ball and tell the computer where the mouse has moved. However, these systems suffered from low durability and accuracy, and required internal cleaning. Modern mice use optical technology to directly trace movement of the surface under the mouse and are much more accurate, durable and almost maintenance-free. They work on a wider variety of surfaces and can even operate on walls, ceilings or other non-horizontal surfaces.

All computers require either fixed or removable storage for their operating system, programs and user-generated material. Early home computers used compact audio cassettes for file storage; these were at the time a very low cost storage solution, but were displaced by floppy disk drives when manufacturing costs dropped, by the mid-1980s. Initially, the 5.25-inch and 3.5-inch floppy drives were the principal forms of removable storage for backup of user files and distribution of software. As memory sizes increased, the capacity of the floppy did not keep pace; the Zip drive and other higher-capacity removable media were introduced but never became as prevalent as the floppy drive. By the late 1990s, the optical drive, in CD and later DVD and Blu-ray Disc forms, became the main method for software distribution, and writeable media provided means for data backup and file interchange. As a result, floppy drives became uncommon in desktop personal computers since about 2000, and were dropped from many laptop systems even earlier.[note 1]

A second generation of tape storage appeared when videocassette recorders were pressed into service as backup media for larger disk drives; all of these systems were less reliable and slower than purpose-built magnetic tape drives. Such tape drives were uncommon in consumer personal computers but were a necessity in business or industrial use. Interchange of data such as photographs from digital cameras is greatly expedited by a card reader, which is often compatible with several forms of flash memory card. It is usually faster and more convenient to move large amounts of data by removing the card from the mobile device than by communicating with the device over a USB interface.

A USB flash drive performs many of the data transfer and backup functions formerly handled by floppy drives, Zip disks and other removable media. Mainstream operating systems for personal computers provide built-in support for USB flash drives, allowing interchange even between computers with different processors and operating systems. The compact size, lack of moving parts or dirt-sensitive media, low cost and high capacity have made USB flash drives a popular and useful accessory for any personal computer user.
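
As a small illustration of the kind of backup task a flash drive is used for, the following Python sketch copies a folder of user files onto the drive; the source folder and the mount point are hypothetical and depend on where the operating system mounts the drive.

```python
# Minimal sketch: copying a folder of user files to a USB flash drive for
# backup. The source and destination paths are hypothetical examples.

import os
import shutil

SOURCE = os.path.expanduser("~/Documents")   # files to back up
DESTINATION = "/media/usb/backup"            # hypothetical mount point

def backup(source, destination):
    """Copy every file under source into destination, preserving structure."""
    for root, _dirs, files in os.walk(source):
        relative = os.path.relpath(root, source)
        target_dir = os.path.join(destination, relative)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            shutil.copy2(os.path.join(root, name), os.path.join(target_dir, name))

if os.path.isdir(DESTINATION):
    backup(SOURCE, DESTINATION)
else:
    print("Flash drive not mounted at", DESTINATION)
```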

The operating system can reside on any of these storage devices, but is typically installed on a hard disk or solid-state drive. A Live CD runs an operating system directly from a CD without installing it to internal storage; while this is slow compared to running from a hard disk drive, it is typically used for installing operating systems, demonstrations, system recovery and other special purposes. Large-capacity flash storage is still more expensive than hard disk drives of similar capacity (as of mid-2014) but is appearing in laptop computers because of its low weight, small size and low power requirements. Computer communications hardware includes internal modem cards, external modems, network adapter cards and routers. Common peripherals and adapter cards include headsets, joysticks, microphones, printers, scanners, sound cards (as a separate card rather than integrated on the motherboard), speakers and webcams.

Computer software is any kind of computer program, procedure, or documentation that performs some task on a computer system.[51] The term includes application software such as word processors that perform productive tasks for users, system software such as operating systems that interface with computer hardware to provide the necessary services for application software, and middleware that controls and co-ordinates distributed systems.

Software applications are common for word processing, Internet browsing, Internet faxing, e-mail and other digital messaging, multimedia playback, playing computer games, and computer programming. The user of a modern personal computer may have significant knowledge of the operating environment and application programs, but is not necessarily interested in programming, nor even able to write programs for the computer. Therefore, most software written primarily for personal computers tends to be designed with simplicity of use, or "user-friendliness", in mind. However, the software industry continuously provides a wide range of new products for use on personal computers, targeted at both the expert and the non-expert user.

An operating system (OS) manages computer resources and provides programmers with an interface used to access those resources. An operating system processes system data and user input, and responds by allocating and managing tasks and internal system resources as a service to users and programs of the system. An operating system performs basic tasks such as controlling and allocating memory, prioritizing system requests, controlling input and output devices, facilitating computer networking, and managing files.
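
To make this concrete, the following Python sketch shows a program requesting a few of these basic services (process identity, environment variables, and file management) through the standard library, which in turn calls into the operating system. The printed values will of course differ from machine to machine.

```python
# Minimal sketch: a program asking the operating system for basic services
# via Python's standard library, which wraps the underlying OS facilities.

import os
import tempfile

print("Process ID assigned by the OS:", os.getpid())
print("Current working directory:", os.getcwd())
print("PATH environment variable (truncated):", os.environ.get("PATH", "")[:60])

# File management: the OS allocates space for the file and tracks it.
with tempfile.NamedTemporaryFile(mode="w", suffix=".txt", delete=False) as f:
    f.write("hello from a temporary file\n")
    temp_path = f.name

print("OS created a temporary file at:", temp_path)
os.remove(temp_path)  # ask the OS to delete it again
```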

Common contemporary desktop operating systems are Microsoft Windows, macOS, Linux, Solaris and FreeBSD. Windows, macOS, and Linux all have server and personal variants. With the exception of Microsoft Windows, the designs of each of them were inspired by or directly inherited from the Unix operating system, which was developed at Bell Labs beginning in the late 1960s and spawned the development of numerous free and proprietary operating systems.
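
A program that needs to behave differently on each of these systems can ask the operating system to identify itself. The following Python sketch uses the standard platform module; the branches are only illustrative.

```python
# Minimal sketch: detecting which desktop operating system a program is
# running on, using Python's standard platform module.

import platform

system = platform.system()  # e.g. "Windows", "Darwin" (macOS), "Linux", "FreeBSD"

if system == "Windows":
    print("Running on Microsoft Windows", platform.release())
elif system == "Darwin":
    print("Running on macOS", platform.mac_ver()[0])
else:
    print("Running on", system, platform.release())
```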

Microsoft Windows is the collective brand name of several operating systems made by Microsoft which, as of 2015, are installed on PCs built by HP, Dell and Lenovo, the three remaining high-volume manufacturers.[52] Microsoft first introduced an operating environment named Windows in November 1985,[53] as an add-on to MS-DOS and in response to the growing interest in graphical user interfaces (GUIs)[54][55] generated by Apple's 1984 introduction of the Macintosh.[56] As of January 2017, the most recent client and server versions of Windows are Windows 10 and Windows Server 2016.

macOS (formerly OS X) is a line of operating systems developed, marketed and sold by Apple Inc. macOS is the successor to the original Mac OS, which had been Apple's primary operating system since 1984. macOS is a Unix-based graphical operating system; Leopard, Snow Leopard, Lion, Mountain Lion, Mavericks, Yosemite and El Capitan are among its version codenames. The most recent version is macOS Sierra.

On the iPhone, iPad and iPod, Apple provides versions of iOS (a derivative of OS X), ranging from iOS 1.0 to the recent iOS 10. iOS devices, however, are not considered PCs.

Linux is a family of Unix-like computer operating systems. Linux is one of the most prominent examples of free software and open source development: typically all underlying source code can be freely modified, used, and redistributed by anyone.[57] The name "Linux" refers to the Linux kernel, started in 1991 by Linus Torvalds. The system's utilities and libraries usually come from the GNU operating system, announced in 1983 by Richard Stallman. The GNU contribution is the basis for the alternative name GNU/Linux.[58]

Known for its use in servers, with the LAMP application stack as one of the most prominent examples, Linux is supported by corporations such as Dell, Hewlett-Packard, IBM, Novell, Oracle Corporation, Red Hat, Canonical Ltd. and Sun Microsystems. It is used as an operating system for a wide variety of computer hardware, including desktop computers, netbooks, supercomputers,[59] video game systems such as the Steam Machine or PlayStation 3 (until this option was removed remotely by Sony in 2010[60]), several arcade games, and embedded devices such as mobile phones, portable media players, routers, and stage lighting systems.

Generally, a computer user uses application software to carry out a specific task. System software supports applications and provides common services such as memory management, network connectivity and device drivers; these services may be used by applications but are not directly of interest to the end user. A simplified hardware analogy is the relationship of an electric light bulb (an application) to an electric power plant (a system): the power plant merely generates electricity, which is of no real use until harnessed by an application such as the light bulb, which performs a service that benefits the user.

Typical examples of software applications are word processors, spreadsheets, and media players. Multiple applications bundled together as a package are sometimes referred to as an application suite. Microsoft Office and LibreOffice, which bundle together a word processor, a spreadsheet, and several other discrete applications, are typical examples. The separate applications in a suite usually share some commonality in their user interfaces, making it easier for the user to learn and use each application. Often, they can interact with each other in ways beneficial to the user; for example, a spreadsheet created in the separate spreadsheet application might be embedded in a word processor document.

End-user development tailors systems to meet the user's specific needs. User-written software includes spreadsheet templates, word processor macros, scientific simulations, graphics and animation scripts; even email filters are a kind of user software. Users create this software themselves and often overlook how important it is.
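
As an example of such user-written software, the following Python sketch shows a tiny email filter of the kind an end user might write for themselves; the message format and filing rules are hypothetical.

```python
# Minimal sketch of a user-written email filter. The message structure and
# filing rules are hypothetical examples.

MESSAGES = [
    {"from": "newsletter@example.com", "subject": "Weekly deals"},
    {"from": "boss@example.com",       "subject": "Quarterly report"},
    {"from": "friend@example.com",     "subject": "Lunch on Friday?"},
]

RULES = [
    # (test, folder) pairs, checked in order
    (lambda m: "newsletter" in m["from"], "Promotions"),
    (lambda m: "report" in m["subject"].lower(), "Work"),
]

def file_message(message):
    """Return the folder a message should be filed into."""
    for test, folder in RULES:
        if test(message):
            return folder
    return "Inbox"

for msg in MESSAGES:
    print(msg["subject"], "->", file_message(msg))
```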

PC gaming is popular among the high-end PC market. According to an April 2014 market analysis, gaming platforms such as Steam, Uplay, Origin, and GOG.com (as well as competitive eSports titles like League of Legends) were largely responsible for PC systems overtaking console revenue in 2013.[61]

In 2001, 125 million personal computers were shipped, compared with 48,000 in 1977.[62] More than 500 million personal computers were in use in 2002, and one billion personal computers had been sold worldwide from the mid-1970s up to that time. Of that billion, 75% were professional or work related, while the rest were sold for personal or home use. About 81.5% of personal computers shipped had been desktop computers, 16.4% laptops and 2.1% servers. The United States had received 38.8% (394 million) of the computers shipped, Europe 25%, and 11.7% had gone to the Asia-Pacific region, the fastest-growing market as of 2002. The second billion was expected to be sold by 2008.[63] Almost half of all households in Western Europe had a personal computer, and a computer could be found in 40% of homes in the United Kingdom, compared with only 13% in 1985.[64]

Global personal computer shipments were 350.9 million units in 2010,[65] 308.3 million units in 2009[66] and 302.2 million units in 2008.[67][68] Shipments were 264 million units in 2007, according to iSuppli,[69] up 11.2% from 239 million in 2006.[70] In 2004, global shipments were 183 million units, an 11.6% increase over 2003.[71] In 2003, 152.6 million computers were shipped, at an estimated value of $175 billion.[72] In 2002, 136.7 million PCs were shipped, at an estimated value of $175 billion.[72] In 2000, 140.2 million personal computers were shipped, at an estimated value of $226 billion.[72] Worldwide shipments of personal computers surpassed the 100-million mark in 1999, growing to 113.5 million units from 93.3 million units in 1998.[73] In 1999, Asia had 14.1 million units shipped.[74]

As of June 2008, the number of personal computers in use worldwide had reached one billion,[75] with another billion expected by 2014. Mature markets such as the United States, Western Europe and Japan accounted for 58% of the worldwide installed PCs. Emerging markets were expected to double their installed PCs by 2012 and to account for 70% of the second billion PCs. About 180 million computers (16% of the existing installed base) were expected to be replaced and 35 million to be dumped into landfills in 2008. The whole installed base grew 12% annually.[76][77]

Based on International Data Corporation (IDC) data for Q2 2011, China surpassed the US in PC shipments for the first time, with 18.5 million units versus 17.7 million. This trend reflects the rise of emerging markets as well as the relative stagnation of mature regions.[78]

In the developed world, vendors traditionally kept adding functions to maintain the high prices of personal computers. However, since the introduction of the One Laptop per Child foundation and its low-cost XO-1 laptop, the computing industry has begun to pursue lower prices as well. Although netbooks had been introduced only a year earlier, 14 million were sold in 2008.[79] Besides the regular computer manufacturers, companies making especially rugged versions of computers have sprung up, offering alternatives for people operating their machines in extreme weather or environments.[80]

In 2011, the consulting firm Deloitte predicted that smartphones and tablet computers would surpass PCs in sales[83] (as has happened since 2012). As of 2013, worldwide sales of PCs had begun to fall as many consumers moved to tablets and smartphones for gifts and personal use. Sales of 90.3 million units in the fourth quarter of 2012 represented a 4.9% decline from sales in the fourth quarter of 2011.[84] Global PC sales fell sharply in the first quarter of 2013, according to IDC data. The 14% year-over-year decline was the largest on record since the firm began tracking in 1994, and double what analysts had been expecting.[85][86] The decline in Q2 2013 PC shipments marked the fifth straight quarter of falling sales.[87] "This is horrific news for PCs," remarked an analyst. "It's all about mobile computing now. We have definitely reached the tipping point."[85] Data from Gartner Inc. showed a similar decline for the same time period.[85] China's Lenovo Group bucked the general trend, as strong sales to first-time buyers in the developing world allowed the company's sales to stay flat overall.[85] Windows 8, which was designed to look similar to tablet and smartphone software, was cited as a contributing factor in the decline of new PC sales. "Unfortunately, it seems clear that the Windows 8 launch not only didn't provide a positive boost to the PC market, but appears to have slowed the market," said IDC Vice President Bob O'Donnell.[86]

In August 2013, Credit Suisse published research findings attributing around 75% of the PC industry's operating profit to Microsoft (operating system) and Intel (semiconductors).[88] According to IDC, PC shipments dropped by 9.8% in 2013, the greatest drop ever, in line with consumers' shift toward mobile devices.[89]

Selling prices of personal computers steadily declined due to lower costs of production and manufacture, while the capabilities of computers increased. In 1975, an Altair kit sold for only around US$400, but required customers to solder components into circuit boards; peripherals required to interact with the system in alphanumeric form instead of blinking lights would add another $2,000, and the resultant system was only of use to hobbyists.[90]

At their introduction in 1981, the US$1,795 price of the Osborne 1 and its competitor Kaypro was considered an attractive price point; these systems had text-only displays and only floppy disks for storage. By 1982, Michael Dell observed that a personal computer system selling at retail for about $3,000 US was made of components that cost the dealer about $600; typical gross margin on a computer unit was around $1,000.[91] The total value of personal computer purchases in the US in 1983 was about $4 billion, comparable to total sales of pet food. By late 1998, the average selling price of personal computer systems in the United States had dropped below $1,000.[92]

For Microsoft Windows systems, the average selling price (ASP) declined in 2008/2009, possibly due to low-cost netbooks, reaching $569 for desktop computers and $689 for laptops at U.S. retail in August 2008. By January 2009 the ASP had fallen further, to $533 for desktops and $602 for notebooks, and stood at $540 and $560 respectively in February.[93] According to research firm NPD, the average selling price of all Windows portable PCs fell from $659 in October 2008 to $519 in October 2009.[94]

Personal computing can fulfill individual needs, but that fulfillment may come at a cost to society, especially in terms of environmental impact, although this impact differs between desktop computers and laptops.[95] Toxic chemicals found in some computer hardware include lead, mercury, cadmium, chromium, plastic (PVC), and barium. Overall, a computer is about 17% lead, copper, zinc, mercury, and cadmium; 23% plastic, 14% aluminum, and 20% iron.[citation needed] Lead is found in cathode ray tube (CRT) displays and on printed circuit boards and most expansion cards.[citation needed] Mercury may be present in an LCD screen's fluorescent backlight. Plastic is found mostly in the housing of the computation and display circuitry. While everyday end users are not exposed to these toxic elements, the danger arises during the recycling process, which involves manually breaking down hardware and can expose workers to measurable amounts of lead or mercury; such exposure can cause serious brain damage, and these substances can also contaminate drinking water supplies. Computer recycling is best handled by the electronic waste (e-waste) industry and kept segregated from the general community dump.

Personal computers have become a large contributor to the 50 million tons of electronic waste discarded annually, according to the United Nations Environment Programme. To address the electronic waste issue affecting developing countries and the environment, extended producer responsibility (EPR) acts have been implemented in various countries and states.[96] Organizations such as the Silicon Valley Toxics Coalition, Basel Action Network, Toxics Link India, SCOPE, and Greenpeace have contributed to these efforts. In the absence of comprehensive national legislation or regulation on the export and import of electronic waste, the Silicon Valley Toxics Coalition and the Basel Action Network (BAN) teamed up with 32 electronics recyclers in the US and Canada to create an e-steward program for the orderly disposal of manufacturers' and customers' electronic waste. The Silicon Valley Toxics Coalition also founded the Electronics TakeBack Coalition, which advocates for the production of environmentally friendly products and works with policy makers, recyclers, and businesses to get manufacturers to take full responsibility for their products. Some organizations, such as the Reason Foundation, oppose EPR regulation. They see flaws in two principal tenets of EPR: first, that if manufacturers have to pay for environmental harm, they will adapt their practices; and second, that current design practices are environmentally inefficient. The Reason Foundation claims that manufacturers naturally move toward reduced material and energy use.

Read more:
Personal computer - Wikipedia

Written by grays | July 30th, 2017 at 2:32 pm

