
Enlightenment (spiritual) – Wikipedia

Posted: August 1, 2017 at 1:41 am


Enlightenment is the "full comprehension of a situation".[web 1] The term is commonly used to denote the Age of Enlightenment,[note 1] but is also used in Western cultures in a religious context. It translates several Buddhist terms and concepts, most notably bodhi,[note 2] kensho and satori. Related terms from Asian religions are moksha (liberation) in Hinduism, Kevala Jnana in Jainism, and ushta in Zoroastrianism.

In Christianity, the word "enlightenment" is rarely used, except to refer to the Age of Enlightenment and its influence on Christianity. Roughly equivalent terms in Christianity may be illumination, kenosis, metanoia, revelation, salvation and conversion.

Perennialists and Universalists view enlightenment and mysticism as equivalent terms for religious or spiritual insight.

The English term "enlightenment" has commonly been used to translate several Sanskrit, Pali,[web 2] Chinese and Japanese terms and concepts, especially bodhi, prajna, kensho, satori and buddhahood.

Bodhi is a Theravada term. It literally means "awakening" and "understanding". Someone who is awakened has gained insight into the workings of the mind which keep us imprisoned in craving, suffering and rebirth,[web 1] and has also gained insight into the way that leads to nirvana, the liberation of oneself from this imprisonment.

Prajna is a Mahayana term. It refers to insight into our true nature, which according to Madhyamaka is empty of a personal essence in the stream of experience. But it also refers to the Tathāgata-garbha or Buddha-nature, the essential basic-consciousness beyond the stream of experience.

In Zen, kensho means "seeing into one's true nature". Satori is often used interchangeably with kensho, but refers to the experience of kensho.

Buddhahood is the attainment of full awakening and becoming a Buddha. According to the Tibetan Thubten Yeshe,[web 3] enlightenment

[means] full awakening; buddhahood. The ultimate goal of Buddhist practice, attained when all limitations have been removed from the mind and one's positive potential has been completely and perfectly realized. It is a state characterized by infinite compassion, wisdom and skill.[web 4]

In Indian religions moksha (Sanskrit: mokṣa; liberation) or mukti (release; both from the root muc, "to let loose, let go") is the final extrication of the soul or consciousness (purusha) from samsara and the bringing to an end of all the suffering involved in being subject to the cycle of repeated death and rebirth (reincarnation).

Advaita Vedanta (IAST: Advaita Vedānta) is a philosophical concept whose followers seek liberation/release by recognizing the identity of the Self (Atman) and the Whole (Brahman) through long preparation and training, usually under the guidance of a guru, involving efforts such as knowledge of scriptures, renunciation of worldly activities, and inducement of direct identity experiences. Originating in India before 788 AD, Advaita Vedanta is widely considered the most influential and most dominant[web 5] sub-school of the Vedānta (literally, end or goal of the Vedas, Sanskrit) school of Hindu philosophy. Other major sub-schools of Vedānta are Viśiṣṭādvaita and Dvaita; the minor ones include Śuddhādvaita, Dvaitādvaita and Achintya Bhedābheda.

Advaita (literally, non-duality) is a system of thought where "Advaita" refers to the identity of the Self (Atman) and the Whole (Brahman).[note 3] Recognition of this identity leads to liberation. Attaining this liberation takes a long preparation and training under the guidance of a guru.

The key source texts for all schools of Vedānta are the Prasthanatrayi, the canonical texts consisting of the Upanishads, the Bhagavad Gita and the Brahma Sutras. The first person to explicitly consolidate the principles of Advaita Vedanta was Shankara Bhagavadpada, while the first historical proponent was Gaudapada, the guru of Shankara's guru Govinda Bhagavatpada.

Shankara systematized the works of preceding philosophers. His system of Vedanta introduced the method of scholarly exegesis on the accepted metaphysics of the Upanishads. This style was adopted by all the later Vedanta schools.[citation needed]

Shankara's synthesis of Advaita Vedanta is summarized in this quote from the Vivekacūḍāmaṇi, one of his Prakaraṇa granthas (philosophical treatises):[note 4]

In half a couplet I state, what has been stated by crores of texts;

that is, Brahman alone is real, the world is mithyā (not independently existent), and the individual self is non-different from Brahman.

In the 19th century Vivekananda played a major role in the revival of Hinduism, and the spread of Advaita Vedanta to the West via the Ramakrishna Mission. His interpretation of Advaita Vedanta has been called "Neo-Vedanta".

In a talk on "The absolute and manifestation" given in London in 1896, Swami Vivekananda said,

I may make bold to say that the only religion which agrees with, and even goes a little further than modern researchers, both on physical and moral lines, is the Advaita, and that is why it appeals to modern scientists so much. They find that the old dualistic theories are not enough for them, do not satisfy their necessities. A man must have not only faith, but intellectual faith too.[web 6]

Vivekananda emphasized samadhi as a means to attain liberation. Yet this emphasis is not to be found in the Upanishads nor in Shankara. For Shankara, meditation and Nirvikalpa Samadhi are means to gain knowledge of the already existing unity of Brahman and Atman, not the highest goal itself:

[Y]oga is a meditative exercise of withdrawal from the particular and identification with the universal, leading to contemplation of oneself as the most universal, namely, Consciousness. This approach is different from the classical yoga of complete thought suppression.

Vivekananda's modernisation has been criticized:

Without calling into question the right of any philosopher to interpret Advaita according to his own understanding of it, [...] the process of Westernization has obscured the core of this school of thought. The basic correlation of renunciation and Bliss has been lost sight of in the attempts to underscore the cognitive structure and the realistic structure which according to Samkaracarya should both belong to, and indeed constitute, the realm of māyā.

Neo-Advaita is a new religious movement based on a modern, Western interpretation of Advaita Vedanta, especially the teachings of Ramana Maharshi. Neo-Advaita is being criticized[note 6][note 7][note 8] for discarding the traditional prerequisites of knowledge of the scriptures and "renunciation as necessary preparation for the path of jnana-yoga". Notable neo-advaita teachers are H. W. L. Poonja, his students Gangaji and Andrew Cohen,[note 9] Madhukar[23] and Eckhart Tolle.

The prime means to reach moksha is the practice of yoga (Sanskrit, Pāli: yoga), a commonly known generic term for physical, mental, and spiritual disciplines which originated in ancient India. Specifically, yoga is one of the six āstika ("orthodox") schools of Hindu philosophy. It is based on the Yoga Sūtras of Patañjali. Various traditions of yoga are found in Hinduism, Buddhism, Jainism and Sikhism.[note 10]

Pre-philosophical speculations and diverse ascetic practices of the first millennium BCE were systematized into a formal philosophy in the early centuries CE by the Yoga Sutras of Patanjali. By the turn of the first millennium, Hatha yoga emerged as a prominent tradition of yoga distinct from Patanjali's Yoga Sutras. While the Yoga Sutras focus on discipline of the mind, Hatha yoga concentrates on health and purity of the body.

Hindu monks, beginning with Swami Vivekananda, brought yoga to the West in the late 19th century. In the 1980s, yoga became popular as a physical system of health exercises across the Western world. Many studies have tried to determine the effectiveness of yoga as a complementary intervention for cancer, schizophrenia, asthma and heart patients. In a national survey, long-term yoga practitioners in the United States reported musculoskeletal and mental health improvements.

Classical Advaita Vedanta emphasises the path of jnana yoga, a progression of study and training to attain moksha. It consists of four stages:[32][web 12]

The paths of bhakti yoga and karma yoga are subsidiary.

In bhakti yoga, practice centers on the worship of God in any way and in any form, like Krishna or Ayyappa. Adi Shankara himself was a proponent of devotional worship or Bhakti. But Adi Shankara taught that while Vedic sacrifices, puja and devotional worship can lead one in the direction of jnana (true knowledge), they cannot lead one directly to moksha. At best, they can serve as means to obtain moksha via shukla gati.[citation needed]

Karma yoga is the way of doing one's duties while disregarding personal gains or losses. According to Sri Swami Sivananda,

Karma Yoga is consecration of all actions and their fruits unto the Lord. Karma Yoga is performance of actions dwelling in union with the Divine, removing attachment and remaining balanced ever in success and failure.

Karma Yoga is selfless service unto humanity. Karma Yoga is the Yoga of action which purifies the heart and prepares the Antahkarana (the heart and the mind) for the reception of Divine Light or attainment of Knowledge of the Self. The important point is that you will have to serve humanity without any attachment or egoism.[web 15]

Jainism (Sanskrit: Jainadharma, Tamil: Samaṇam, Bengali: Jainadharma, Telugu: Jainamata, Malayalam: Jainmat, Kannada: Jaina dharma) is an Indian religion that prescribes a path of non-violence towards all living beings. Its philosophy and practice emphasize the necessity of self-effort to move the soul toward divine consciousness and liberation. Any soul that has conquered its own inner enemies and achieved the state of supreme being is called a jina ("conqueror" or "victor"). The ultimate status of these perfect souls is called siddha. Ancient texts also refer to Jainism as shramana dharma (self-reliant) or the "path of the nirgranthas" (those without attachments or aversions).

In Jainism the highest form of pure knowledge a soul can attain is called Kevala Jnana (Sanskrit) or Kevala Ṇāṇa (Prakrit), from kevala, which means "absolute" or "perfect", and jñāna, which means "knowledge". Kevala is the state of isolation of the jīva from the ajīva, attained through ascetic practices which burn off one's karmic residues, releasing one from bondage to the cycle of death and rebirth. Kevala Jñāna thus means infinite knowledge of self and non-self, attained by a soul after annihilation of all ghātiyā karmas. The soul which has reached this stage achieves moksha, or liberation, at the end of its life span.

Mahavira, the 24th tirthankara of Jainism, is said to have practised rigorous austerities for 12 years before he attained enlightenment:

During the thirteenth year, in the second month of summer, in the fourth fortnight, the light (fortnight) of Vaisakha, on its tenth day, when the shadow had turned towards the east and the first wake was over, on the day called Suvrata, in the Muhurta called Vigaya, outside of the town Grimbhikagrama on the bank of the river Rjupalika, not far from an old temple, in the field of the householder Samaga, under a Sal tree, when the moon was in conjunction with the asterism Uttara Phalguni, (the Venerable One) in a squatting position with joined heels, exposing himself to the heat of the sun, after fasting two and a half days without drinking water, being engaged in deep meditation, reached the highest knowledge and intuition, called Kevala, which is infinite, supreme, unobstructed, unimpeded, complete, and full.[citation needed]

Kevala Jñāna is one of the five major events in the life of a Tirthankara and is known as Jñāna Kalyanaka, supposedly celebrated by all gods. Mahavira's Kaivalya was said to have been celebrated by the demi-gods, who constructed the Samosarana, or grand preaching assembly, for him.

In the Western world the concept of enlightenment in a religious context acquired a romantic meaning. It has become synonymous with self-realization and the true self, which is regarded as a substantial essence covered over by social conditioning.[note 12]

The use of the Western word enlightenment is based on the supposed resemblance of bodhi with Aufklärung, the independent use of reason to gain insight into the true nature of our world. In fact there are more resemblances with Romanticism than with the Enlightenment: the emphasis on feeling, on intuitive insight, on a true essence beyond the world of appearances.

The equivalent term "awakening" has also been used in a Christian context,[35] namely the Great Awakenings, several periods of religious revival in American religious history. Historians and theologians identify three or four waves of increased religious enthusiasm occurring between the early 18th century and the late 19th century. Each of these "Great Awakenings" was characterized by widespread revivals led by evangelical Protestant ministers, a sharp increase of interest in religion, a profound sense of conviction and redemption on the part of those affected, an increase in evangelical church membership, and the formation of new religious movements and denominations.

Another equivalent term is Illuminationism, which was also used by Paul Demiéville in his work The Mirror of the Mind, in which he made a distinction between "illumination subite" (sudden) and "illumination graduelle" (gradual).[web 16] Illuminationism is a doctrine according to which the process of human thought needs to be aided by divine grace. It is the oldest and most influential alternative to naturalism in the theory of mind and epistemology.[37] It was an important feature of ancient Greek philosophy, Neoplatonism, medieval philosophy, and, in particular, the Illuminationist school of Islamic philosophy.

Augustine was an important proponent of Illuminationism, stating that everything we know is taught to us by God as He casts His light over the world,[web 17] saying that "The mind needs to be enlightened by light from outside itself, so that it can participate in truth, because it is not itself the nature of truth. You will light my lamp, Lord"[38] and "You hear nothing true from me which you have not first told me."[39] Augustine's version of illuminationism is not that God gives us certain information, but rather gives us insight into the truth of the information we received for ourselves.

This romantic idea of enlightenment as insight into a timeless, transcendent reality has been popularized especially by D.T. Suzuki.[web 18][web 19] Further popularization was due to the writings of Heinrich Dumoulin.[web 20] Dumoulin viewed metaphysics as the expression of a transcendent truth, which according to him was expressed by Mahayana Buddhism, but not by the pragmatic analysis of the oldest Buddhism, which emphasizes anatta. This romantic vision is also recognizable in the works of Ken Wilber.

In the oldest Buddhism this essentialism is not recognizable.[web 21] According to critics it doesn't really contribute to a real insight into Buddhism:[web 22]

...most of them labour under the old cliché that the goal of Buddhist psychological analysis is to reveal the hidden mysteries in the human mind and thereby facilitate the development of a transcendental state of consciousness beyond the reach of linguistic expression.

A common reference in Western culture is the notion of "enlightenment experience". This notion can be traced back to William James, who used the term "religious experience" in his book The Varieties of Religious Experience. Wayne Proudfoot traces the roots of the notion of "religious experience" further back to the German theologian Friedrich Schleiermacher (1768–1834), who argued that religion is based on a feeling of the infinite. The notion of "religious experience" was used by Schleiermacher to defend religion against the growing scientific and secular critique.

It was popularised by the Transcendentalists, and exported to Asia via missionaries. Transcendentalism developed as a reaction against 18th-century rationalism, John Locke's philosophy of Sensualism, and the predestinationism of New England Calvinism. It is fundamentally rooted in a variety of diverse sources, such as Hindu texts like the Vedas, the Upanishads and the Bhagavad Gita, various religions, and German idealism.

It was adopted by many scholars of religion, of which William James was the most influential.[note 13]

The notion of "experience" has been criticised. Robert Sharf points out that "experience" is a typical Western term, which has found its way into Asian religiosity via western influences.[note 14] The notion of "experience" introduces a false notion of duality between "experiencer" and "experienced", whereas the essence of kensho is the realisation of the "non-duality" of observer and observed. "Pure experience" does not exist; all experience is mediated by intellectual and cognitive activity. The specific teachings and practices of a specific tradition may even determine what "experience" someone has, which means that this "experience" is not the proof of the teaching, but a result of the teaching. A pure consciousness without concepts, reached by "cleaning the doors of perception",[note 15] would be an overwhelming chaos of sensory input without coherence.

Nevertheless, the notion of religious experience has gained widespread use in the study of religion, and is extensively researched.

The word "enlightenment" is not generally used in Christian contexts for religious understanding or insight. More commonly used terms in the Christian tradition are religious conversion and revelation.

Lewis Sperry Chafer (1871–1952), one of the founders of Dispensationalism, uses the word "illuminism". Christians who are "illuminated" are of two groups: those who have experienced true illuminism (biblical) and those who experienced false illuminism (not from the Holy Spirit).

Christian interest in eastern spirituality has grown throughout the 20th century. Notable Christians, such as Hugo Enomiya-Lassalle and AMA Samy, have participated in Buddhist training and even become Buddhist teachers themselves. In a few places Eastern contemplative techniques have been integrated in Christian practices, such as centering prayer.[web 24] But this integration has also raised questions about the borders between these traditions.[web 25]

Western and Mediterranean culture has a rich tradition of esotericism and mysticism. The Perennial philosophy, basic to the New Age understanding of the world, regards those traditions as akin to Eastern religions which aim at awakening/enlightenment and developing wisdom. The hypothesis that all mystical traditions share a "common core" is central to New Age, but contested by a diversity of scholars such as Katz and Proudfoot.

Judaism includes the mystical tradition of Kabbalah. Islam includes the mystical tradition of Sufism. In the Fourth Way teaching, enlightenment is the highest state of Man (humanity).

A popular western understanding sees "enlightenment" as "nondual consciousness", "a primordial, natural awareness without subject or object".[web 26] It is used interchangeably with Neo-Advaita.

This nondual consciousness is seen as a common stratum to different religions. Several definitions or meanings are combined in this approach, which makes it possible to recognize various traditions as having the same essence. According to Renard, many forms of religion are based on an experiential or intuitive understanding of "the Real".

This idea of nonduality as "the central essence" is part of a modern mutual exchange and synthesis of ideas between western spiritual and esoteric traditions and Asian religious revival and reform movements.[note 16] Western predecessors are, among others, New Age, Wilber's synthesis of western psychology and Asian spirituality, the idea of a Perennial Philosophy, and Theosophy. Eastern influences are the Hindu reform movements such as Aurobindo's Integral Yoga and Vivekananda's Neo-Vedanta, the Vipassana movement, and Buddhist modernism. A truly syncretistic influence is Osho and the Rajneesh movement, a hybrid of eastern and western ideas and teachings, with a mainly western group of followers.

"Religious experiences" have "evidential value",[77] since they confirm the specific worldview of the experiencer:[78]

These experiences are cognitive in that, allegedly at least, the subject of the experience receives a reliable and accurate view of what, religiously considered, are the most important features of things. This, so far as their religious tradition is concerned, is what is most important about them. This is what makes them "salvific" or powerful to save.[79]

Yet, just like the very notion of "religious experience" is shaped by a specific discourse and habitus, the "uniformity of interpretation" may be due to the influence of religious traditions which shape the interpretation of such experiences.[78]

Yandell discerns various "religious experiences" and their corresponding doctrinal settings, which differ in structure and phenomenological content, and in the "evidential value" they present.[82] Yandell discerns five sorts:[83]

Various philosophers and cognitive scientists state that there is no "true self" or a "little person" (homunculus) in the brain that "watches the show," and that consciousness is an emergent property that arises from the various modules of the brain in ways that are yet far from understood.[90] According to Susan Greenfield, the "self" may be seen as a composite, whereas Douglas R. Hofstadter describes the sense of "I" as a result of cognitive processes.

This is in line with the Buddhist teachings, which state that

[...] what we call 'I' or 'being,' is only a combination of physical and mental aggregates which are working together interdependently in a flux of momentary change within the law of cause and effect, and that there is nothing permanent, everlasting, unchanging, and eternal in the whole of existence.

To this end, Parfit called Buddha the "first bundle theorist".

The idea that the mind is the result of the activities of neurons in the brain was most notably popularized by Francis Crick, the co-discoverer of DNA, in his book The Astonishing Hypothesis.[note 17] The basic idea can be traced back to at least Étienne Bonnot de Condillac. According to Crick, the idea was not a novel one:

[...] an exceptionally clear statement of it can be found in a well known paper by Horace Barlow.

Several users of entheogens throughout the ages have claimed spiritual enlightenment with the use of these substances; their use and prevalence through history are well recorded and continue today. In modern times there has been increased interest in these practices, for example the rise of interest in Ayahuasca. The psychological effects of these substances have been subject to scientific research focused on understanding their physiological basis.

Read more here:
Enlightenment (spiritual) - Wikipedia

Written by grays |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

From the Enlightenment to the Dark Ages: How new atheism slid … – Salon

Posted: at 1:41 am


The new atheist movement emerged shortly after the 9/11 attacks with a best-selling book by Sam Harris called The End of Faith. This was followed by engaging tomes authored by Richard Dawkins, Daniel Dennett and the late Christopher Hitchens, among others. Avowing to champion the values of science and reason, the movement offered a growing number of unbelievers, tired of faith-based foolishness mucking up society for the rest of us, some hope for the future. For many years I was among the new atheism movement's greatest allies.

From the start, though, the movement had some curious quirks. Although many atheists are liberals and empirical studies link higher IQs to both liberalism and atheism, Hitchens gradually abandoned his Trotskyist political affiliations for what could, in my view, be best described as a neoconservative outlook. Indeed, he explicitly endorsed the 2003 U.S. invasion of Iraq, now widely seen as perhaps the greatest foreign policy blunder in American history.

There were also instances in which critiques of religion, most notably Islam, went beyond what was both intellectually warranted and strategically desirable. For example, Harris wrote in a 2004 Washington Times op-ed that "We are at war with Islam." He added a modicum of nuance in subsequent sentences, but I know of no experts on Islamic terrorism who would ever suggest that uttering such a categorical statement in a public forum is judicious. As the terrorism scholar Will McCants noted in an interview that I conducted with him last year, there are circumstances in which certain phrases, even if true, are best not uttered, since they are unnecessarily incendiary. In what situation would claiming that the West is engaged in a civilizational clash with an entire religion actually improve the expected outcome?

Despite these peccadilloes, if that's what they are, new atheism still had much to offer. Yet the gaffes kept on coming, to the point that no rational person could simply dismiss them as noise in the signal. For example, Harris said in 2014 that new atheism was dominated by men because it lacks the "nurturing, coherence-building extra estrogen vibe that you would want by default if you wanted to attract as many women as men."

This resulted in an exodus of women from the movement who decided that the new atheist label was no longer for them. (I know of many diehard atheist women who wanted nothing to do with new atheism, which is a real shame.) Harris' attempted self-exoneration didn't help, either; it merely revealed a moral scotoma in his understanding of gender, sexism and related issues. What he should have done is, quite simply, said "I'm sorry." These words, I have come to realize, are nowhere to be found in the new atheist lexicon.

Subsequent statements about profiling at airports, serious allegations of rape at atheist conferences, and tweets from major leaders that (oops!) linked to white supremacist websites further alienated women, people of color and folks that one could perhaps describe as morally normal. Yet some of us mostly white men like myself persisted in our conviction that, overall, the new atheist movement was still a force for good in the world. It is an extraordinary personal embarrassment that I maintained this view until the present year.

For me, it was a series of recent events that pushed me over the edge. As a philosopher someone who cares deeply about intellectual honesty, verifiable evidence, critical thinking and moral thoughtfulness I now find myself in direct opposition with many new atheist leaders. That is, I see my own advocacy for science, critical thought and basic morality as standing in direct opposition to their positions.

Just consider a recent tweet from one of the most prominent new atheist luminaries, Peter Boghossian: "Why is it that nearly every male who's a 3rd wave intersectional feminist is physically feeble & has terrible body habitus?" If this is what it means to be a reasonable person, then who would want to be that? Except for the vocabulary, that looks like something you'd find in Donald Trump's Twitter feed. The same goes for another of Boghossian's deep thoughts: "I've never understood how someone could be proud of being gay. How can one be proud of something one didn't work for?" It's hard to know where to even begin dissecting this bundle of shameful ignorance.

More recently, Boghossian and his sidekick James Lindsay published a hoax academic paper in a gender studies journal (except that it wasn't) in an attempt to embarrass the field of gender studies, which they, having no expertise in the field, believe is dominated by a radical feminist ideology that sees the penis as the root of all evil. I've explained twice why this hoax actually just revealed a marked lack of skepticism among skeptics themselves, so I won't go further into the details here. Suffice it to say that while bemoaning the sloppy scholarship of gender studies scholars, Boghossian and Lindsay's explanation of the hoax in a Skeptic article contained philosophical mistakes that a second-year undergraduate could detect. Even more, their argument for how the hoax paper exposes gender studies as a fraud contains a demonstrable fatal error; that is, it gets a crucial fact wrong, thus rendering their argument unsound.

The point is this: One would expect skeptics, of all people, who claim to be responsive to the evidence, to acknowledge this factual error. Yet not a single leader of the new atheist movement has publicly mentioned the factual problems with the hoax. Had someone (or preferably all of them) done this, it would have affirmed the new atheist commitment to intellectual honesty, to putting truth before pride and epistemology before ideology, thereby restoring its damaged credibility.

Even worse, Boghossian and Lindsay explicitly argue, in response to some critics, that they don't need to know the field of gender studies to criticize it. This is, properly contextualized, about as anti-intellectual as one can get. Sure, it is a fallacy to immediately dismiss someone's criticisms of a topic simply because that person doesn't have a degree in the topic. Doing this is called the Courtier's Reply. But it decidedly isn't a fallacy to criticize someone for being incredibly ignorant, and even ignorant of their own ignorance, regarding an issue they're making strong, confident-sounding claims about. Kids, listen to me: Knowledge is a good thing, despite what Boghossian and Lindsay suggest, and you should always work hard to understand a position before you level harsh criticisms at it. Otherwise you'll end up looking like a fool to those in the know.

Along these lines, the new atheist movement has flirted with misogyny for years. Harris' "estrogen vibe" statement, which yielded a defense rather than a gracious apology, was only the tip of the iceberg. As mentioned above, there have been numerous allegations of sexual assault, and atheist conferences have pretty consistently been male-dominated, resulting in something like a gender Matthew effect.

Many leading figures have recently allied themselves with small-time television personality Dave Rubin, a guy who has repeatedly given Milo Yiannopoulos, the professional right-wing troll who once said that little boys would stop complaining about being raped by Catholic priests if the priests were as good-looking as he is, a platform on his show. In a tweet from last May, Rubin said "I'd like a signed copy, please" in response to a picture that reads: "Ah. Peace and quiet. #ADayWithoutAWoman." If, say, Paul Ryan were asked, he'd describe this as sort of like the textbook definition of a misogynistic comment. Did any new atheist leaders complain about this tweet? Of course not, much to the frustration of critical thinkers like myself who actually care about how women are treated in society.

In fact, the magazine Skeptic just published a glowing review of Yiannopoulos' recent book, Dangerous. The great irony of this intellectual misstep is that Yiannopoulos embodies the opposite of nearly every trend of moral progress that Michael Shermer, the editor of Skeptic, identifies in his book The Moral Arc.

Yiannopoulos is a radical anti-intellectual, often ignoring facts or simply lying about issues; he uses hyperbolic rhetoric (e.g., "feminism is cancer") that stymies rather than promotes rational discussion; he holds some outright racist views; he professes nonsensical views, such as the idea that birth control makes women unattractive and crazy; he uses hate speech, which indicates that he's not a very nice person; he once publicly called out a transgender student by name during a talk; and he supports Donald Trump, who has essentially led a society-wide campaign against the Enlightenment. Oh, and need I mention that Yiannopoulos once said that if it weren't for his own experience of abuse by a Catholic priest, he never would have learned to give such good head? The merger between the alt-right and the new atheist movement continues to solidify.

Perhaps the most alarming instance of irrationality in recent memory, though, is Sam Harris' recent claim that black people are less intelligent than white people. This emerged from a conversation that Harris had with Charles Murray, co-author of The Bell Curve and a monetary recipient of the racist Pioneer Fund. There are two issues worth dwelling upon here. The first is scientific: Despite what Harris asserts, science does not support the conclusion that there are gene-based IQ differences between the races. To confirm this, I emailed the leading psychologist Howard Gardner, who told me that "The racial difference speculations of Herrnstein and Murray remain very controversial," as well as James Flynn (world-renowned for the Flynn effect), who responded that, "Taking into account the range of evidence, I believe that black and white Americans are not distinguished by genes for IQ. However, the debate is ongoing."

The point is simply this: Scottish philosopher David Hume famously declared that the wise person always proportions her beliefs to the evidence. It follows that when a community of experts is divided on an issue, it behooves the rational non-expert to hold her opinion in abeyance. In direct opposition to this epistemic principle, Harris takes a firm stand on race and intelligence, even receiving adulation for doing this from other white men in the new atheist community. A more thoughtful public intellectual would have said: "Look, this is a very complicated issue that leading psychologists disagree about. A minority say there is a genetically based correlation between race and IQ while many others claim just the opposite, with perhaps the largest group holding that we simply don't know enough right now. Since I am rational, I too will say that we simply don't know."

The second issue is ethical: Is it right, wise or justified to publicly declare that one race is genetically inferior to another, given the immense societal consequences this could have? Not only could this claim empower white supremacists, individuals who wouldn't be sympathetic with Harris' follow-up claim that generalizations about a race of people don't warrant discriminating against individual members of that race, but science tells us that such information can have direct and appreciable negative consequences for members of the targeted race. For example, stereotype threat describes how the mere mention that one's racial class is inferior can have measurable detrimental effects on one's cognitive performance. Similarly, teacher expectancy effects refer to the finding that if teachers are told that some students are smart and others are dumb, where the "smart" and "dumb" labels are randomly assigned, the "smart" students will statistically do better in class than the "dumb" ones.

To broadcast a scientifically questionable meme that could have serious bad effects for people already struggling in a society that was founded upon racism and is still struggling to overcome it is, I would argue, the height of intellectual irresponsibility.

Although the new atheist movement once filled me with a great sense of optimism about the future of humanity, this is no longer the case. Movements always rise and fall, they have a life cycle of sorts, but the fall of this movement has been especially poignant for me. The new atheists of today would rather complain about trigger warnings in classrooms than eliminate rape on campuses. They'd rather whine about safe spaces than help transgender people feel accepted by society. They loudly claim to support free speech and yet routinely ban dissenters from social media, blogs and websites.

They say they care about facts, yet refuse to change their beliefs when inconvenient data are presented. They decry people who make strong assertions outside of their field and yet feel perfectly entitled to make fist-poundingly confident claims about issues they know little about. And they apparently don't give a damn about alienating women and people of color, a truly huge demographic of potential allies in the battle against religious absurdity.

On a personal note, a recent experience further cemented my view that the new atheists are guilty of false advertising. A podcaster named Lalo Dagach saw that I had criticized Harris' understanding of Islamic terrorism, which I believe lacks scholarly rigor. In response, he introduced me to his Twitter audience of 31,000 people as follows: "Phil Torres (@xriskology) everyone. Mourns the loss of ISIS and celebrates attacks on atheists." Below this tweet was a screenshot of the last two articles I had written for Salon: one about the importance of listening to the experts on terrorism, and the other about how the apocalyptic ideology of the Islamic extremists of ISIS is more likely to evolve into new forms than go extinct.

First of all, Dagach's tweet was overtly defamatory. I wrote him asking for a public apology and heard nothing back, although he quietly deleted the tweet. But even that did not happen until I had received a hailstorm of disturbing responses to Dagach's false statements, responses in the form of internet trolls aggressively defending Harris by asking me to kill myself and proposing new nicknames like "Phil Hitler Torres" (seriously!). This is the new atheist movement today, by and large. The great enemy of critical thinking and epistemological integrity, namely tribalism, has become the social glue of the community.

I should still be the new atheist movement's greatest ally, yet today I want nothing whatsoever to do with it. From censoring people online while claiming to support free speech, to endorsing scientifically unfounded claims about race and intelligence, to asserting, as Harris once did, that the profoundly ignorant Ben Carson would make a better president than the profoundly knowledgeable Noam Chomsky, the movement has repeatedly shown itself to lack precisely the values it once avowed to uphold. Words that now come to mind when I think of new atheism are "un-nuanced," "heavy-handed," "unjustifiably confident" and "resistant to evidence," not to mention, on the whole, misogynist and racist.

And while there are real and immensely important issues to focus on in the world, such as climate change, nuclear proliferation, food production, ocean acidification, the sixth mass extinction and so on, even the most cursory glance at any leading new atheist's social-media feed reveals a bizarre obsession with what they call the "regressive left." This is heartbreaking, because humanity needs thoughtful, careful, nuanced, scientifically minded thinkers more now than ever before.

See original here:
From the Enlightenment to the Dark Ages: How new atheism slid ... - Salon

Written by admin |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

Enlightenment, Rogue-like Game Set To Hit Steam On August 4th … – One Angry Gamer (blog)

Posted: at 1:41 am


(Last Updated On: July 30, 2017)

Fans of rogue-like games have yet another title to look forward to that will be hitting Steam Early Access very soon: Coconut Island Games and LizardKing's Enlightenment. The game is set to hit PC via Steam come August 4th.

Enlightenment is an action-shooter game with heavy doses of rogue-like features nestled into its core. The narrative is said to be nonlinear and sends players on an adventure into a wasteland flaunting a mysterious dungeon known as The Ark. Enlightenment invites the curious and fans of rogue-like genres to indulge in a risky journey plagued by a monstrous crisis.

As for the story of Enlightenment, the content explaining said rogue-like game lies below for you to read over:

An asteroid wiped out civilization as we know it. Some wasteland tramp discovered that the asteroid shards grant possessors unexplained powers; so they founded this cult, calling it the Scientific Church of Enlightenment, and this Church of Enlightenment built the Ark, and they built a whole city around it. It's gonna be where the restoration of humanity starts, they said. But just look around you; these streets are all empty, not a soul to be seen at all now.

The underground settlement often referred to as The Ark is a dim complex that houses the dead bodies of the debased-minded who dared to enter unprepared. The game tests the skill of players to see whether they become yet another addition to the complex or a hero in the making.

In Enlightenment, you play a roguelike action-shooter offering a fast-paced, challenging journey deep into an underground complex called the Ark. Players will be tested by a large variety of enemies in a procedurally-generated dungeon, and become stronger by learning from the inevitable deaths.

The developers not too long ago posted up the latest Enlightenment trailer along with its Steam Early Access page, which the former is up for your viewing pleasure.

As noted above, Enlightenment is set to drop on August 4th for PC via Steam Early Access. Additional information on this game can be found by checking out its main site.


Link:
Enlightenment, Rogue-like Game Set To Hit Steam On August 4th ... - One Angry Gamer (blog)

Written by admin |

August 1st, 2017 at 1:41 am

Posted in Enlightenment

Personal computer – Wikipedia

Posted: July 30, 2017 at 2:32 pm


A personal computer (PC) is a multi-purpose electronic computer whose size, capabilities, and price make it feasible for individual use. PCs are intended to be operated directly by an end user, rather than by a computer expert or technician.

"Computers were invented to 'compute': to solve complex mathematical problems, but today, due to media dependency and the everyday use of computers, it is seen that 'computing' is the least important thing computers do."[1] The computer time-sharing models that were typically used with larger, more expensive minicomputer and mainframe systems, to enable them be used by many people at the same time, are not used with PCs.

Early computer owners in the 1960s, invariably institutional or corporate, had to write their own programs to do any useful work with the machines. In the 2010s, personal computer users have access to a wide range of commercial software, free software ("freeware") and free and open-source software, which are provided in ready-to-run form. Software for personal computers is typically developed and distributed independently from the hardware or OS manufacturers.[2] Many personal computer users no longer need to write their own programs to make any use of a personal computer, although end-user programming is still feasible. This contrasts with systems such as smartphones or tablet computers, where software is often only available through a manufacturer-supported channel, and end-user program development may be discouraged by lack of support by the manufacturer.

Since the early 1990s, Microsoft operating systems and Intel hardware have dominated much of the personal computer market, first with MS-DOS and then with Windows. Alternatives to Microsoft's Windows operating systems occupy a minority share of the industry. These include Apple's macOS and free open-source Unix-like operating systems such as Linux. Advanced Micro Devices (AMD) provides the main alternative to Intel's processors.

"PC" is an initialism for "personal computer". The IBM Personal Computer incorporated the designation in its model name, but IBM has not used this brand for many years. It is sometimes useful, especially in a marketing context, to distinguish personal computers of the "IBM Personal Computer" family from personal computers made by other manufacturers. For example, "PC" is used in contrast with "Mac", an Apple Macintosh computer.[3][4][5][6] This sense of the word is used in the Get a Mac advertisement campaign that ran between 2006 and 2009, as well as its rival, I'm a PC campaign, that appeared in 2008. Since none of these Apple products were mainframes or time-sharing systems, they were all "personal computers" and not "PC" (brand) computers.

The brain [computer] may one day come down to our level [of the common people] and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far.

In the history of computing there were many examples of computers designed to be used by one person, as opposed to terminals connected to mainframe computers. It took a while for computers to be developed that meet the modern definition of a "personal computer": one that is designed for one person, is easy to use, and is cheap enough for an individual to buy.[8]

Using the narrow definition of "operated by one person", the first personal computer was the ENIAC which became operational in 1946.[9] It did not meet further definitions of affordable or easy to use.

An example of an early single-user computer was the LGP-30, created in 1956 by Stan Frankel and used for science and engineering as well as basic data processing.[10] It came with a retail price of $47,000, equivalent to about $414,000 today.[11]

Introduced at the 1965 New York World's Fair, the Programma 101 was a printing programmable calculator[12][13] described in advertisements as a "desktop computer".[14][15][16][17] It was manufactured by the Italian company Olivetti and invented by the Italian engineer Pier Giorgio Perotto, inventor of the magnetic card system for program storage.[citation needed]

The Soviet MIR series of computers was developed from 1965 to 1969 in a group headed by Victor Glushkov. It was designed as a relatively small-scale computer for use in engineering and scientific applications and contained a hardware implementation of a high-level programming language. Another innovative feature for that time was the user interface combining a keyboard with a monitor and light pen for correcting texts and drawing on screen.[18] In what was later to be called the Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of what would become the staples of daily working life in the 21st century: e-mail, hypertext, word processing, video conferencing and the mouse. The demonstration required technical support staff and a mainframe time-sharing computer that were far too costly for individual business use at the time.

By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single person. Early personal computers, generally called microcomputers, were often sold in a kit form and in limited volumes, and were of interest mostly to hobbyists and technicians. Minimal programming was done with toggle switches to enter instructions, and output was provided by front panel lamps. Practical use required adding peripherals such as keyboards, computer displays, disk drives, and printers. Micral N was the earliest commercial, non-kit microcomputer based on a microprocessor, the Intel 8008. It was built starting in 1972 and about 90,000 units were sold. This had been preceded by the Datapoint 2200 in 1970, for which the Intel 8008 had been commissioned, though not accepted for use. The CPU design implemented in the Datapoint 2200 became the basis for the x86 architecture used in the original IBM PC and its descendants.[19]

In 1973 the IBM Los Gatos Scientific Center developed a portable computer prototype called SCAMP (Special Computer APL Machine Portable) based on the IBM PALM processor with a Philips compact cassette drive, small CRT and full function keyboard. SCAMP emulated an IBM 1130 minicomputer in order to run APL1130.[20] In 1973 APL was generally available only on mainframe computers, and most desktop-sized microcomputers such as the Wang 2200 or HP 9800 offered only BASIC. Because SCAMP was the first to emulate APL1130 performance on a portable, single-user computer, PC Magazine in 1983 designated SCAMP a "revolutionary concept" and "the world's first personal computer".[20][21] This seminal, single-user portable computer now resides in the Smithsonian Institution, Washington, D.C. Successful demonstrations of the 1973 SCAMP prototype led to the IBM 5100 portable microcomputer, launched in 1975 with the ability to be programmed in both APL and BASIC for engineers, analysts, statisticians and other business problem-solvers. In the late 1960s such a machine would have been nearly as large as two desks and would have weighed about half a ton.[20]

A seminal step in personal computing was the 1973 Xerox Alto, developed at Xerox's Palo Alto Research Center (PARC). It had a graphical user interface (GUI) which later served as inspiration for Apple Computer's Macintosh, and Microsoft's Windows operating system. The Alto was a demonstration project, not commercialized, as the parts were too expensive to be affordable.[8]

Also in 1973 Hewlett Packard introduced fully BASIC programmable microcomputers that fit entirely on top of a desk, including a keyboard, a small one-line display and printer. The Wang 2200 microcomputer of 1973 had a full-size cathode ray tube (CRT) and cassette tape storage.[22] These were generally expensive specialized computers sold for business or scientific uses. The introduction of the microprocessor, a single chip with all the circuitry that formerly occupied large cabinets, led to the proliferation of personal computers after 1975.

1974 saw the introduction of what is considered by many to be the first true "personal computer", the Altair 8800 created by Micro Instrumentation and Telemetry Systems (MITS).[23][24] Based on the 8-bit Intel 8080 Microprocessor,[25] the Altair is widely recognized as the spark that ignited the microcomputer revolution[26] as the first commercially successful personal computer.[27] The computer bus designed for the Altair was to become a de facto standard in the form of the S-100 bus, and the first programming language for the machine was Microsoft's founding product, Altair BASIC.[28][29]

In 1976, Steve Jobs and Steve Wozniak sold the Apple I computer circuit board, which was fully prepared and contained about 30 chips. The Apple I computer differed from the other kit-style hobby computers of the era. At the request of Paul Terrell, owner of the Byte Shop, Steve Jobs was given his first purchase order, for 50 Apple I computers, only if the computers were assembled and tested and not a kit computer. Terrell wanted to have computers to sell to a wide range of users, not just experienced electronics hobbyists who had the soldering skills to assemble a computer kit. The Apple I as delivered was still technically a kit computer, as it did not have a power supply, case, or keyboard when it was delivered to the Byte Shop.

The first successfully mass-marketed personal computer was the Commodore PET, introduced in January 1977. However, it was back-ordered and not available until later in the year.[30] Five months later (June), the Apple II (usually referred to as the "Apple") was introduced,[31] and the TRS-80 from Tandy Corporation / Tandy Radio Shack followed in summer 1977, with deliveries beginning in September in small numbers. Mass-market ready-assembled computers allowed a wider range of people to use computers, focusing more on software applications and less on development of the processor hardware.

During the early 1980s, home computers were further developed for household use, with software for personal productivity, programming and games. They typically could be used with a television already in the home as the computer display, with low-detail blocky graphics and a limited color range, and text about 40 characters wide by 25 characters tall. Sinclair Research,[32] a UK company, produced the ZX Series: the ZX80 (1980), ZX81 (1981), and the ZX Spectrum; the latter was introduced in 1982 and totaled 8 million units sold. It was followed by the Commodore 64, which totaled 17 million units sold.[33][34]

In the same year, the NEC PC-98 was introduced; it was a very popular personal computer that sold more than 18 million units.[35] Another famous personal computer, the revolutionary Amiga 1000, was unveiled by Commodore on July 23, 1985. The Amiga 1000 featured a multitasking, windowing operating system, color graphics with a 4096-color palette, stereo sound, a Motorola 68000 CPU, 256 KB of RAM, and an 880 KB 3.5-inch disk drive, for US$1,295.[36]

Somewhat larger and more expensive systems (for example, running CP/M), or sometimes a home computer with additional interfaces and devices, although still low-cost compared with minicomputers and mainframes, were aimed at office and small business use, typically using "high resolution" monitors capable of at least 80 column text display, and often no graphical or color drawing capability. Workstations were characterized by high-performance processors and graphics displays, with large-capacity local disk storage, networking capability, and running under a multitasking operating system. Eventually, due to the influence of the IBM PC on the personal computer market, personal computers and home computers lost any technical distinction. Business computers acquired color graphics capability and sound, and home computers and game systems users used the same processors and operating systems as office workers. Mass-market computers had graphics capabilities and memory comparable to dedicated workstations of a few years before. Even local area networking, originally a way to allow business computers to share expensive mass storage and peripherals, became a standard feature of personal computers used at home.

In 1982 "The Computer" was named Machine of the Year by Time magazine. In the 2010s, several companies such as Hewlett-Packard and Sony sold off their PC and laptop divisions. As a result, the personal computer was declared dead several times during this period.[37]

A workstation is a high-end personal computer designed for technical, mathematical, or scientific applications. Intended primarily to be used by one person at a time, they are commonly connected to a local area network and run multi-user operating systems. Workstations are used for tasks such as computer-aided design, drafting and modeling, computation-intensive scientific and engineering calculations, image processing, architectural modeling, and computer graphics for animation and motion picture visual effects.[38]

Prior to the widespread usage of PCs, a computer that could fit on a desk was remarkably small, leading to the "desktop" nomenclature. More recently, the phrase usually indicates a particular style of computer case. Desktop computers come in a variety of styles ranging from large vertical tower cases to small models which can be tucked behind an LCD monitor. In this sense, the term "desktop" refers specifically to a horizontally oriented case, usually intended to have the display screen placed on top to save desk space. Most modern desktop computers have an external display screen and an external keyboard, which are typically plugged into the computer case.

A gaming computer is a standard desktop computer that typically has high-performance hardware, such as a more powerful video card, processor and memory, in order to handle the requirements of demanding video games, which are often simply called "PC games".[39] A number of companies, such as Alienware, manufacture prebuilt gaming computers, and companies such as Razer and Logitech market mice, keyboards and headsets geared toward gamers.

Single-unit PCs (also known as all-in-one PCs) are a subtype of desktop computers that combine the monitor and case of the computer within a single unit. The monitor often utilizes a touchscreen as an optional method of user input, but separate keyboards and mice are normally still included. The inner components of the PC are often located directly behind the monitor and many of such PCs are built similarly to laptops.

A subtype of desktops, called nettops, was introduced by Intel in February 2008, characterized by low cost and lean functionality. A similar subtype of laptops (or notebooks) is the netbook, described below. The product line features the new Intel Atom processor, which specifically enables nettops to consume less power and fit into small enclosures.

A home theater PC (HTPC) is a convergence device that combines the functions of a personal computer and a digital video recorder. It is connected to a TV set or an appropriately sized computer display, and is often used as a digital photo viewer, music and video player, TV receiver, and digital video recorder. HTPCs are also referred to as media center systems or media servers. The general goal of an HTPC is usually to combine many or all components of a home theater setup into one box. More recently, HTPCs gained the ability to connect to services providing on-demand movies and TV shows. HTPCs can be purchased pre-configured with the required hardware and software needed to add television programming to the PC, or can be cobbled together out of discrete components, as is commonly done with software support from MythTV, Windows Media Center, GB-PVR, SageTV, Famulent or LinuxMCE.

A laptop computer, also called a notebook, is a small personal computer designed for portability. Usually, all of the hardware and interfaces needed to operate a laptop, such as the graphics card, audio devices or USB ports (previously parallel and serial ports), are built into a single unit. Laptops usually have a "clamshell" design, in which the keyboard and computer components sit on one panel and a flat display screen on a second panel, which is hinged to the first. The laptop is opened for use and closed for transport, which also protects the screen and keyboard. Laptops have both a power cable that can be plugged in and high-capacity batteries that can power the device, enhancing their portability. Once the battery charge is depleted, it has to be recharged from a power outlet. In the interests of saving power, weight and space, laptop graphics processors are in many cases integrated into the CPU or chipset and use system RAM, resulting in reduced graphics performance compared with an equivalent desktop machine. For this reason, desktop or gaming computers are usually preferred over laptops for gaming purposes.

One of the drawbacks of laptops is that, due to the size and configuration of components, relatively little can usually be done to upgrade the overall computer from its original design or to add components. Internal upgrades are often not recommended by the manufacturer, can damage the laptop if done without care or knowledge, or are in some cases impossible, making the desktop PC more modular and upgradable. Desktop PCs typically have a case with extra empty space inside, where users can install new components. Some internal upgrades to laptops, such as memory and hard disk drive upgrades, are often easily performed, while a display or keyboard upgrade is usually difficult or impossible. Just like desktops, laptops have input and output ports for connecting to a wide variety of devices, including external displays, mice, cameras, storage devices and keyboards, which may be attached externally through USB ports and other, less common ports such as external video. Laptops are also somewhat more expensive than comparable desktops, as their miniaturized components are costlier to produce.

A subtype of notebooks, called subnotebook, has most of the features of a standard laptop computer, but with smaller physical dimensions. Subnotebooks are larger than hand-held computers, and usually run full versions of desktop or laptop operating systems. Ultra-Mobile PCs (UMPC) are usually considered subnotebooks, or more specifically, subnotebook tablet PCs, which are described below. Netbooks are sometimes considered to belong to this category, though they are sometimes separated into a category of their own (see below).

A desktop replacement computer (DTR) is a personal computer that provides the full capabilities of a desktop computer while remaining mobile. Such computers are often actually larger, bulkier laptops. Because of their increased size, this class of computers usually includes more powerful components and a larger display than generally found in smaller portable computers, and can have a relatively limited battery capacity or, in some cases, none at all. Some use a limited range of desktop components to provide better performance at the expense of battery life. Desktop replacement computers are sometimes called desknotes, a portmanteau of "desktop" and "notebook".[40]

Netbooks, also called mini notebooks or subnotebooks, are a subgroup of laptops[41] covering small, lightweight and inexpensive laptop computers suited for general computing tasks and accessing web-based applications. They are often marketed as "companion devices", intended to augment the other ways in which a user can access computer resources.[41] Walt Mossberg called them a "relatively new category of small, light, minimalist and cheap laptops."[42] By August 2009, CNET called netbooks "nothing more than smaller, cheaper notebooks."[41] Initially, the primary defining characteristic of netbooks was the lack of an optical disc drive, which had to be a separate external device. This has become less important as flash memory devices have gradually increased in capacity, replacing the writable optical disc (e.g. CD-RW, DVD-RW) as a transportable storage medium.

At their inception in late 2007, as smaller notebooks optimized for low weight and low cost,[43] netbooks omitted key features (e.g., the optical drive), featured smaller screens and keyboards, and offered reduced specifications and computing power. Over the course of their evolution, netbook screen sizes have ranged from below five inches[44] to over 13 inches,[45] with weights of around 1 kg (2 to 3 pounds). Often significantly less expensive than other laptops,[46] by mid-2009 netbooks had been offered to users "free of charge", with an extended service contract purchase of a cellular data plan.[47] In the short period since their appearance, netbooks have grown in size and features, converging with new smaller and lighter notebooks. By mid-2009, CNET noted that "the specs are so similar that the average shopper would likely be confused as to why one is better than the other," noting "the only conclusion is that there really is no distinction between the devices."[41]

A tablet is a type of portable PC that de-emphasizes the use of traditional input devices (such as a mouse or keyboard) by using a touchscreen display, which can be controlled with either a stylus or a finger. Some tablets use a "hybrid" or "convertible" design, offering a keyboard that can be removed as an attachment, or a screen that can be rotated and folded directly over the keyboard. Some tablets run a traditional PC operating system such as Windows or Linux; Microsoft attempted to enter the tablet market in 2002 with its Microsoft Tablet PC specifications for tablets and convertible laptops running Windows XP. However, Microsoft's early attempts were overshadowed by the release of Apple's iPad; following in its footsteps, most modern tablets use slate designs and run mobile operating systems such as Android and iOS, giving them functionality similar to smartphones. In response, Microsoft built its Windows 8 operating system to better accommodate these touch-oriented devices.[48] Many tablet computers have USB ports to which a keyboard or mouse can be connected.

The ultra-mobile PC (UMPC) is a specification for small-configuration tablet PCs. It was developed as a joint exercise by Microsoft, Intel and Samsung, among others. Current UMPCs typically feature the Windows XP, Windows Vista, Windows 7, or Linux operating system, and low-voltage Intel Atom or VIA C7-M processors.

A pocket PC is a hardware specification for a handheld-sized computer (personal digital assistant, PDA) that runs the Microsoft Windows Mobile operating system. It may have the capability to run an alternative operating system like NetBSD or Linux. Pocket PCs have many of the capabilities of modern desktop PCs. Numerous applications are available for handhelds adhering to the Microsoft Pocket PC specification, many of which are freeware. Some of these devices also include mobile phone features, effectively making them smartphones. Microsoft-compliant Pocket PCs can also be used with many other add-ons like GPS receivers, barcode readers, RFID readers and cameras. In 2007, with the release of Windows Mobile 6, Microsoft dropped the name Pocket PC in favor of a new naming scheme: devices without an integrated phone are called Windows Mobile Classic instead of Pocket PC, while devices with an integrated phone and a touch screen are called Windows Mobile Professional.[49]

Computer hardware is a comprehensive term for all physical parts of a computer, as distinguished from the data it contains or operates on, and the software that provides instructions for the hardware to accomplish tasks. The boundary between hardware and software has become blurred, with the existence of firmware that is software "built into" the hardware. For example, a 2010-era LCD display screen contains a small computer inside. Mass-market consumer computers use highly standardized components and so are simple for an end user to assemble into a working system. Most 2010s-era computers only require users to plug in the power supply, monitor, and other cables. A typical desktop computer consists of a computer case (or "tower"), a metal chassis that holds the power supply, motherboard, hard disk drive, and often an optical disc drive. Most towers have empty space where users can add additional components. External devices such as a computer monitor or visual display unit, keyboard, and a pointing device (mouse) are usually found in a personal computer.

The motherboard connects the processor, memory and peripheral devices together. The RAM, graphics card and processor are in most cases mounted directly onto the motherboard. The central processing unit (microprocessor chip) plugs into a CPU socket, while the memory modules plug into corresponding memory sockets. Some motherboards have the video display adapter, sound and other peripherals integrated onto the motherboard, while others use expansion slots for graphics cards, network cards, or other I/O devices. The graphics card or sound card may employ a breakout box to keep the analog parts away from the electromagnetic radiation inside the computer case. Disk drives, which provide mass storage, are connected to the motherboard with one cable and to the power supply through another. Usually, disk drives are mounted in the same case as the motherboard; expansion chassis are also made for additional disk storage.

For large amounts of data, a tape drive can be used, or extra hard disks can be put together in an external case. The keyboard and the mouse are external devices plugged into the computer through connectors on an I/O panel on the back of the computer case. The monitor is also connected to the input/output (I/O) panel, either through an onboard port on the motherboard or a port on the graphics card. Capabilities of the personal computer's hardware can sometimes be extended by the addition of expansion cards connected via an expansion bus. Standard peripheral buses often used for adding expansion cards in personal computers include PCI, PCI Express (PCIe), and AGP (a high-speed PCI bus dedicated to graphics adapters, found in older computers). Most modern personal computers have multiple physical PCI Express expansion slots, with some of them having PCI slots as well.

A computer case is an enclosure that contains the main components of a computer. They are usually constructed from steel or aluminum combined with plastic, although other materials such as wood have been used for specialized units. Cases are available in different sizes and shapes; the size and shape of a computer case is usually determined by the configuration of the motherboard that it is designed to accommodate, since this is the largest and most central component of most computers. The most popular style for desktop computers is ATX, although microATX and similar layouts became very popular for a variety of uses. Companies like Shuttle Inc. and AOpen have popularized small cases, for which FlexATX is the most common motherboard size. In the 1990s, desktop computer cases were larger and taller than 2010-era computer cases.

The power supply unit (PSU) converts general-purpose mains AC electricity to direct current (DC) for the other components of the computer. The rated output capacity of a PSU should usually be about 40% greater than the calculated system power consumption needs obtained by adding up all the system components. This protects against overloading the supply, and guards against performance degradation.
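
As a rough illustration of the sizing rule described above, the following Python sketch sums a hypothetical parts list and applies the 40% headroom figure from the paragraph. The component names, wattages and the list of "standard" PSU ratings are invented for the example, not measurements of any particular system.

```python
import math

# Illustrative (made-up) component power draws, in watts, for a mid-range desktop.
components_w = {
    "cpu": 95,
    "graphics_card": 150,
    "motherboard": 40,
    "ram": 10,
    "hard_disk": 10,
    "optical_drive": 25,
    "case_fans": 10,
}

HEADROOM = 0.40  # roughly 40% margin, as suggested above
STANDARD_SIZES = [300, 400, 450, 500, 550, 600, 650, 750, 850]  # typical retail ratings (assumed)

calculated_load = sum(components_w.values())
recommended = calculated_load * (1 + HEADROOM)

# Pick the smallest standard rating that meets or exceeds the recommendation.
psu_rating = next((s for s in STANDARD_SIZES if s >= recommended), None)

print(f"Calculated system load: {calculated_load} W")
print(f"Recommended PSU capacity (load + 40%): {math.ceil(recommended)} W")
print(f"Smallest standard PSU that fits: {psu_rating} W")
```

In this made-up example, a 340 W calculated load would call for roughly 476 W of capacity, so a 500 W unit would be the nearest standard fit.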

The central processing unit, or CPU, is the part of a computer that executes the instructions of a software program. In modern PCs, the CPU contains hundreds of millions, or even billions, of transistors in one integrated circuit chip called the microprocessor. In most cases, the processor plugs directly into the motherboard. The chip generates so much heat that the PC builder is required to attach a cooling device to its surface; thus, modern CPUs are equipped with a fan attached via a heat sink. IBM PC compatible computers use an x86-compatible microprocessor, manufactured by Intel, AMD, VIA Technologies or Transmeta. Apple Macintosh computers were initially built with the Motorola 680x0 family of processors, then switched to the PowerPC series; in 2006, they switched to x86-compatible processors made by Intel.

The motherboard, also referred to as system board or main board, is the primary circuit board within a personal computer, and other major system components plug directly into it or via a cable. A motherboard contains a microprocessor, the CPU supporting circuitry (mostly integrated circuits) that provide the interface between memory and input/output peripheral circuits, main memory, and facilities for initial setup of the computer immediately after power-on (often called boot firmware or, in IBM PC compatible computers, a BIOS or UEFI). In many portable and embedded personal computers, the motherboard houses nearly all of the PC's core components. Often a motherboard will also contain one or more peripheral buses and physical connectors for expansion purposes. Sometimes a secondary daughter board is connected to the motherboard to provide further expandability or to satisfy space constraints.

A PC's main memory is a fast primary storage device that is directly accessible by the CPU, and is used to store the currently executing program and immediately needed data. PCs use semiconductor random-access memory (RAM) of various kinds such as DRAM, SDRAM or SRAM as their primary storage. Which exact kind is used depends on cost/performance issues at any particular time. Main memory is much faster than mass storage devices like hard disk drives or optical discs, but is usually volatile, meaning that it does not retain its contents (instructions or data) in the absence of power, and is much more expensive for a given capacity than is most mass storage. As a result, main memory is generally not suitable for long-term or archival data storage.

Mass storage devices store programs and data even when the power is off; they do require power to perform read and write functions during usage. Although flash memory has dropped in cost, the prevailing form of mass storage in personal computers is still the hard disk drive. If the mass storage controller provides additional ports for expandability, a PC may also be upgraded by the addition of extra hard disk or optical disc drives. For example, BD-ROMs, DVD-RWs, and various optical disc recorders may all be added by the user to certain PCs. Standard internal storage device connection interfaces are PATA, Serial ATA and SCSI. Solid state drives (SSDs) are a much faster replacement for traditional mechanical hard disk drives, but are also more expensive in terms of cost per gigabyte.

A visual display unit, computer monitor or just display is a piece of electrical equipment, usually separate from the computer case, which displays visual images without producing a permanent record. A display device was usually a CRT in the 1980s, but by the 2000s flat panel displays such as TFT LCDs had largely replaced the bulkier, heavier CRT screens. Multi-monitor setups are quite common in the 2010s, as they enable a user to display multiple programs at the same time (e.g., an email inbox and a word processing program). The display unit houses the electronic circuitry that generates its picture from signals received from the computer. Within the computer, either integral to the motherboard or plugged into it as an expansion card, there is pre-processing circuitry to convert the microprocessor's output data to a format compatible with the display unit's circuitry. The images from computer monitors originally contained only text, but as graphical user interfaces emerged and became common, they began to display more images and multimedia content. The term "monitor" is also used, particularly in broadcast television, where a picture of the broadcast data is displayed on a highly standardized reference monitor for confidence-checking purposes.

The video card (otherwise called a graphics card, graphics adapter or video adapter) processes the graphics output from the motherboard and transmits it to the display. It is an essential part of modern multimedia-enriched computing. On older models, and today on budget models, graphics circuitry may be integrated with the motherboard, but on modern and flexible machines the card is connected via the PCI, AGP, or PCI Express interface. When the IBM PC was introduced, most existing business-oriented personal computers used text-only display adapters and had no graphics capability. Home computers at that time had graphics compatible with television signals, but with low resolution by modern standards, owing to the limited memory available to the eight-bit processors of the time.

A keyboard is an arrangement of buttons that each correspond to a function, letter, or number. Keyboards are the primary devices used for inputting text. In most cases, they contain an array of keys specifically organized with the corresponding letters, numbers, and functions printed or engraved on the buttons. They are generally designed around an operator's language, and many different versions for different languages exist. In English, the most common layout is the QWERTY layout, which was originally used in typewriters. Keyboards have evolved over time and have been modified for use in computers with the addition of function keys, number keys, arrow keys, and keys specific to an operating system. Often, specific functions can be achieved by pressing multiple keys at once or in succession, such as inputting characters with accents or opening a task manager. Programs define their own keyboard shortcuts for program-specific operations, such as refreshing a web page in a web browser or selecting all text in a word processor. In addition to the alphabetic keys found on a typewriter, computer keyboards typically have a numeric keypad, a row of function keys, and special keys such as Ctrl, Alt, Del and Esc.

A computer mouse is a small handheld device that users hold and slide across a flat surface, pointing at various elements of a graphical user interface with an on-screen cursor, and selecting and moving objects using the mouse buttons. Almost all modern personal computers include a mouse; it may be plugged into a computer's rear mouse socket, connected as a USB device, or, more recently, connected wirelessly via a USB dongle or Bluetooth link. In the past, mice had a single button that users could press down to "click" on whatever the pointer on the screen was hovering over. Modern mice have two, three or more buttons, providing a "right click" button, which performs a secondary action on a selected object, and a scroll wheel, which users can rotate with a finger to "scroll" up or down. The scroll wheel can also be pressed down and so used as a third button. Some mouse wheels can be tilted from side to side to allow sideways scrolling. Different programs make use of these functions differently, and may scroll horizontally by default with the scroll wheel, open different menus with different buttons, and so on. These functions may also be user-defined through software utilities. Mice traditionally detected movement with an internal "mouse ball" and used optical encoders to detect the rotation of the ball and tell the computer where the mouse had moved. However, these systems suffered from poor durability and accuracy and required internal cleaning. Modern mice use optical technology to directly trace movement of the surface under the mouse and are much more accurate, durable and almost maintenance-free. They work on a wider variety of surfaces and can even operate on walls, ceilings or other non-horizontal surfaces.

All computers require either fixed or removable storage for their operating system, programs and user-generated material. Early home computers used compact audio cassettes for file storage; these were at the time a very low-cost storage solution, but were displaced by floppy disk drives when manufacturing costs dropped in the mid-1980s. Initially, the 5.25-inch and 3.5-inch floppy drives were the principal forms of removable storage for backup of user files and distribution of software. As memory sizes increased, the capacity of the floppy did not keep pace; the Zip drive and other higher-capacity removable media were introduced but never became as prevalent as the floppy drive. By the late 1990s, the optical drive, in CD and later DVD and Blu-ray Disc forms, became the main method for software distribution, and writeable media provided a means for data backup and file interchange. As a result, floppy drives became uncommon in desktop personal computers after about 2000, and were dropped from many laptop systems even earlier.[note 1]

A second generation of tape recorders was provided when videocassette recorders were pressed into service as backup media for larger disk drives. All these systems were less reliable and slower than purpose-built magnetic tape drives. Such tape drives were uncommon in consumer-type personal computers but were a necessity in business or industrial use. Interchange of data such as photographs from digital cameras is greatly expedited by installation of a card reader, which is often compatible with several forms of flash memory devices. It is usually faster and more convenient to move large amounts of data by removing the card from the mobile device, instead of communicating with the mobile device through a USB interface.

A USB flash drive performs much of the data transfer and backup functions formerly done with floppy drives, Zip disks and other devices. Mainstream operating systems for personal computers provide built-in support for USB flash drives, allowing interchange even between computers with different processors and operating systems. The compact size and lack of moving parts or dirt-sensitive media, combined with low cost and high capacity, have made USB flash drives a popular and useful accessory for any personal computer user.

The operating system can be located on any storage device, but is typically installed on a hard disk or solid-state drive. A Live CD runs an operating system directly from a CD. While this is slow compared to storing the operating system on a hard disk drive, it is typically used for installation of operating systems, demonstrations, system recovery, or other special purposes. Large-capacity flash storage is currently more expensive than hard disk drives of similar size (as of mid-2014) but is starting to appear in laptop computers because of its low weight, small size and low power requirements. Computer communications involve internal modem cards, modems, network adapter cards, and routers. Common peripherals and adapter cards include headsets, joysticks, microphones, printers, scanners, sound adapter cards (as a separate card rather than located on the motherboard), speakers and webcams.

Computer software is any kind of computer program, procedure, or documentation that performs some task on a computer system.[51] The term includes application software such as word processors that perform productive tasks for users, system software such as operating systems that interface with computer hardware to provide the necessary services for application software, and middleware that controls and co-ordinates distributed systems.

Software applications are common for word processing, Internet browsing, Internet faxing, e-mail and other digital messaging, multimedia playback, playing computer games, and computer programming. The user of a modern personal computer may have significant knowledge of the operating environment and application programs, but is not necessarily interested in programming, nor even able to write programs for the computer. Therefore, most software written primarily for personal computers tends to be designed with simplicity of use, or "user-friendliness", in mind. However, the software industry continuously provides a wide range of new products for use in personal computers, targeted at both the expert and the non-expert user.

An operating system (OS) manages computer resources and provides programmers with an interface used to access those resources. An operating system processes system data and user input, and responds by allocating and managing tasks and internal system resources as a service to users and programs of the system. An operating system performs basic tasks such as controlling and allocating memory, prioritizing system requests, controlling input and output devices, facilitating computer networking, and managing files.

Common contemporary desktop operating systems are Microsoft Windows, macOS, Linux, Solaris and FreeBSD. Windows, macOS, and Linux all have server and personal variants. With the exception of Microsoft Windows, the designs of each of them were inspired by or directly inherited from the Unix operating system, which was developed at Bell Labs beginning in the late 1960s and spawned the development of numerous free and proprietary operating systems.

Microsoft Windows is the collective brand name of several operating systems made by Microsoft which, as of 2015, are installed on PCs built by HP, Dell and Lenovo, the three remaining high-volume manufacturers.[52] Microsoft first introduced an operating environment named Windows in November 1985,[53] as an add-on to MS-DOS and in response to the growing interest in graphical user interfaces (GUIs)[54][55] generated by Apple's 1984 introduction of the Macintosh.[56] As of January 2017[update], the most recent client and server versions of Windows are Windows 10 and Windows Server 2016.

macOS (formerly OS X) is a line of operating systems developed, marketed and sold by Apple Inc. macOS is the successor to the original Mac OS, which had been Apple's primary operating system since 1984. macOS is a Unix-based graphical operating system, and Snow Leopard, Leopard, Lion, Mountain Lion, Mavericks, Yosemite and El Capitan are among its version names. The most recent version of macOS is macOS Sierra.

On iPhone, iPad and iPod, versions of iOS (an OS X derivative) are available from iOS 1.0 to the recent iOS 10. The iOS devices, however, are not considered PCs.

Linux is a family of Unix-like computer operating systems. Linux is one of the most prominent examples of free software and open source development: typically all underlying source code can be freely modified, used, and redistributed by anyone.[57] The name "Linux" refers to the Linux kernel, started in 1991 by Linus Torvalds. The system's utilities and libraries usually come from the GNU operating system, announced in 1983 by Richard Stallman. The GNU contribution is the basis for the alternative name GNU/Linux.[58]

Known for its use in servers, with the LAMP application stack as one of the prominent examples, Linux is supported by corporations such as Dell, Hewlett-Packard, IBM, Novell, Oracle Corporation, Red Hat, Canonical Ltd. and Sun Microsystems. It is used as an operating system for a wide variety of computer hardware, including desktop computers, netbooks, supercomputers,[59] video game systems such as the Steam Machine or PlayStation 3 (until this option was removed remotely by Sony in 2010[60]), several arcade games, and embedded devices such as mobile phones, portable media players, routers, and stage lighting systems.

Generally, a computer user uses application software to carry out a specific task. System software supports applications and provides common services such as memory management, network connectivity and device drivers, all of which may be used by applications but are not directly of interest to the end user. A simplified analogy in the world of hardware would be the relationship of an electric light bulb (an application) to an electric power generation plant (a system): the power plant merely generates electricity, not itself of any real use until harnessed to an application like the electric light that performs a service that benefits the user.

Typical examples of software applications are word processors, spreadsheets, and media players. Multiple applications bundled together as a package are sometimes referred to as an application suite. Microsoft Office and LibreOffice, which bundle together a word processor, a spreadsheet, and several other discrete applications, are typical examples. The separate applications in a suite usually have a user interface that has some commonality, making it easier for the user to learn and use each application. Often, they may have some capability to interact with each other in ways beneficial to the user; for example, a spreadsheet can be embedded in a word processor document even though it was created in the separate spreadsheet application.

End-user development tailors systems to meet the user's specific needs. User-written software include spreadsheet templates, word processor macros, scientific simulations, graphics and animation scripts; even email filters are a kind of user software. Users create this software themselves and often overlook how important it is.

PC gaming is popular among the high-end PC market. According to an April 2014 market analysis, gaming platforms like Steam, Uplay, Origin, and GOG.com (as well as competitive eSports titles like League of Legends) are largely responsible for PC systems overtaking console revenue in 2013.[61]

In 2001, 125 million personal computers were shipped, in comparison to 48,000 in 1977.[62] More than 500 million personal computers were in use in 2002, and one billion personal computers had been sold worldwide from the mid-1970s up to that time. Of the latter figure, 75% were professional or work related, while the rest were sold for personal or home use. About 81.5% of personal computers shipped had been desktop computers, 16.4% laptops and 2.1% servers. The United States had received 38.8% (394 million) of the computers shipped, Europe 25% and 11.7% had gone to the Asia-Pacific region, the fastest-growing market as of 2002. The second billion was expected to be sold by 2008.[63] Almost half of all households in Western Europe had a personal computer, and a computer could be found in 40% of homes in the United Kingdom, compared with only 13% in 1985.[64]

The global personal computer shipments were 350.9 million units in 2010,[65] 308.3 million units in 2009[66] and 302.2 million units in 2008.[67][68] The shipments were 264 million units in the year 2007, according to iSuppli,[69] up 11.2% from 239 million in 2006.[70] In 2004, the global shipments were 183 million units, an 11.6% increase over 2003.[71] In 2003, 152.6 million computers were shipped, at an estimated value of $175 billion.[72] In 2002, 136.7 million PCs were shipped, at an estimated value of $175 billion.[72] In 2000, 140.2 million personal computers were shipped, at an estimated value of $226 billion.[72] Worldwide shipments of personal computers surpassed the 100-million mark in 1999, growing to 113.5 million units from 93.3 million units in 1998.[73] In 1999, Asia had 14.1 million units shipped.[74]

As of June 2008, the number of personal computers in use worldwide hit one billion,[75] while another billion was expected to be reached by 2014. Mature markets like the United States, Western Europe and Japan accounted for 58% of the worldwide installed PCs. The emerging markets were expected to double their installed PCs by 2012 and to take 70% of the second billion PCs. About 180 million computers (16% of the existing installed base) were expected to be replaced and 35 million to be dumped into landfill in 2008. The whole installed base grew 12% annually.[76][77]

Based on International Data Corporation (IDC) data for Q2 2011, for the first time China surpassed the US in PC shipments, with 18.5 million units against 17.7 million. This trend reflects the rise of emerging markets as well as the relative stagnation of mature regions.[78]

In the developed world, there has been a vendor tradition of adding functions to maintain the high prices of personal computers. However, since the introduction of the One Laptop per Child foundation and its low-cost XO-1 laptop, the computing industry has also begun to pursue lower prices. Although introduced only one year earlier, 14 million netbooks were sold in 2008.[79] Besides the regular computer manufacturers, companies making especially rugged versions of computers have sprung up, offering alternatives for people operating their machines in extreme weather or environments.[80]

In 2011, the consulting firm Deloitte predicted that smartphones and tablet computers would surpass PC sales[83] (as has happened since 2012). As of 2013, worldwide sales of PCs had begun to fall as many consumers moved to tablets and smartphones for gifts and personal use. Sales of 90.3 million units in the 4th quarter of 2012 represented a 4.9% decline from sales in the 4th quarter of 2011.[84] Global PC sales fell sharply in the first quarter of 2013, according to IDC data. The 14% year-over-year decline was the largest on record since the firm began tracking in 1994, and double what analysts had been expecting.[85][86] The decline of Q2 2013 PC shipments marked the fifth straight quarter of falling sales.[87] "This is horrific news for PCs," remarked an analyst. "It's all about mobile computing now. We have definitely reached the tipping point."[85] Data from Gartner Inc. showed a similar decline for the same time period.[85] China's Lenovo Group bucked the general trend, as strong sales to first-time buyers in the developing world allowed the company's sales to stay flat overall.[85] Windows 8, which was designed to look similar to tablet/smartphone software, was cited as a contributing factor in the decline of new PC sales. "Unfortunately, it seems clear that the Windows 8 launch not only didn't provide a positive boost to the PC market, but appears to have slowed the market," said IDC Vice President Bob O'Donnell.[86]

In August 2013, Credit Suisse published research findings that attributed around 75% of the operating profit share of the PC industry to Microsoft (operating system) and Intel (semiconductors).[88] According to IDC, PC shipments dropped by 9.8% in 2013, the greatest drop on record, in line with the consumer trend toward mobile devices.[89]

Selling prices of personal computers steadily declined due to lower costs of production and manufacture, while the capabilities of computers increased. In 1975, an Altair kit sold for only around US$400, but required customers to solder components into circuit boards; peripherals required to interact with the system in alphanumeric form instead of blinking lights would add another $2,000, and the resultant system was only of use to hobbyists.[90]

At their introduction in 1981, the US$1,795 price of the Osborne 1 and its competitor Kaypro was considered an attractive price point; these systems had text-only displays and only floppy disks for storage. By 1982, Michael Dell observed that a personal computer system selling at retail for about $3,000 US was made of components that cost the dealer about $600; typical gross margin on a computer unit was around $1,000.[91] The total value of personal computer purchases in the US in 1983 was about $4 billion, comparable to total sales of pet food. By late 1998, the average selling price of personal computer systems in the United States had dropped below $1,000.[92]

For Microsoft Windows systems, the average selling price (ASP) showed a decline in 2008/2009, possibly due to low-cost netbooks, with desktops drawing $569 and laptops $689 at U.S. retail in August 2008. In 2009, the ASP had further fallen to $533 for desktops and to $602 for notebooks by January, and to $540 and $560 in February.[93] According to research firm NPD, the average selling price of all Windows portable PCs fell from $659 in October 2008 to $519 in October 2009.[94]

Personal computing can fulfill individual needs, but that fulfillment may come at a cost to society, especially in terms of environmental impact, although this impact differs between desktop computers and laptops.[95] Toxic chemicals found in some computer hardware include lead, mercury, cadmium, chromium, plastic (PVC), and barium. Overall, a computer is about 17% lead, copper, zinc, mercury, and cadmium; 23% is plastic, 14% is aluminum, and 20% is iron.[citation needed] Lead is found in cathode ray tube (CRT) displays and on all of the printed circuit boards and most expansion cards.[citation needed] Mercury may be present in an LCD screen's fluorescent lamp backlight. Plastic is found mostly in the housing of the computation and display circuitry. While everyday end-users are not exposed to these toxic elements, the danger arises during the computer recycling process, which involves manually breaking down hardware and can expose workers to measurable amounts of lead or mercury, which can cause serious brain damage or contaminate drinking water supplies. Computer recycling is best handled by the electronic waste (e-waste) industry, and kept segregated from the general community dump.

Personal computers have become a large contributor to the 50 million tons of discarded electronic waste generated annually, according to the United Nations Environment Programme. To address the electronic waste issue affecting developing countries and the environment, extended producer responsibility (EPR) acts have been implemented in various countries and states.[96] Organizations such as the Silicon Valley Toxics Coalition, Basel Action Network, Toxics Link India, SCOPE, and Greenpeace have contributed to these efforts. In the absence of comprehensive national legislation or regulation on the export and import of electronic waste, the Silicon Valley Toxics Coalition and BAN (Basel Action Network) teamed up with 32 electronic recyclers in the US and Canada to create an e-steward program for the orderly disposal of manufacturers' and customers' electronic waste. The Silicon Valley Toxics Coalition also founded the Electronics TakeBack Coalition, which advocates for the production of environmentally friendly products. The TakeBack Coalition works with policy makers, recyclers, and smart businesses to get manufacturers to take full responsibility for their products. There are organizations opposing EPR regulation, such as the Reason Foundation, which sees flaws in two principal tenets of EPR: first, EPR relies on the idea that if manufacturers have to pay for environmental harm, they will adapt their practices; second, EPR assumes that current design practices are environmentally inefficient. The Reason Foundation claims that manufacturers naturally move toward reduced material and energy use.

Read more:
Personal computer - Wikipedia

Written by grays |

July 30th, 2017 at 2:32 pm

Seebohm’s Gold Symbolizes Personal Triumph Over Adversities – SwimSwam

Posted: at 2:32 pm


2017 FINA WORLD SWIMMING CHAMPIONSHIPS

Just when it looked as though the nation of Australia could possibly end its aquatic campaign in Budapest without a single gold medal to its credit, reigning world champion Emily Seebohm came to the rescue. After collecting a bronze in the 100m backstroke and clocking a new Commonwealth Record in the 50m back while finishing 4th earlier at this meet, Seebohm saved her best for last and crushed a monster time of 2:05.68 to win the 200m backstroke.

For 25-year-old Seebohm, the gritty performance was not only a victory for Australia, but also a personal triumph for the national team mainstay who's had a difficult year. After cruising to the 100m and 200m backstroke titles at the 2015 World Championships in Kazan, Seebohm went on an absolute tear across the 2015/16 World Cup season.

By the time Rio rolled around, however, it was clear to the Brisbane Grammar swimmer that something wasn't right, as she felt tired, crampy and sluggish. Finishing a disappointing 12th in the 200m back and off the podium in the 100m at the 2016 Olympics, Seebohm waited until after the Games to announce she had been suffering from symptoms of, and ultimately was diagnosed with, endometriosis. She eventually had surgery in December 2016 but refused to blame her lackluster performance in Rio on her health problems.

Giving swim fans perhaps the most emotional performance of her career, Seebohm couldn't hide her satisfaction in knowing she persevered and never gave up in the race.

"Honestly, I'm pretty relieved," Seebohm said as she choked back tears.

"I'm just really honoured and proud, such a fast field tonight and I was going to be proud of myself whether I won or I came last because getting back into the pool after Rio was really hard.

"Everything I've gone through it just proves to myself that it wasn't me, that Rio was just one of those things that happens in life and sometimes you've got to go down, to get back up."

"I guess for me it was really hard after Rio, I knew there was a lot going on in my body and I really pushed through in Rio," Seebohm said.

"After the surgery (for endometriosis) I got my wisdom teeth out in January, and then I had to rush back into the water and train really hard for this and I'm just amazed at what I have achieved tonight.

"I think what I did last year helped a lot, I was very mentally and physically tough last year even though I was struggling a lot it definitely helped me coming into this year, feeling better inside myself, feeling better inside my head and to come into this year and just absolutely enjoy every moment that I've had it's just been a fantastic meet."

In many ways, Seebohm's race strategy tonight in Budapest was representative of her personal journey, maintaining her composure through the 150m mark and charging to the finish with pure guts, fueled by the sheer will to win.

Said Seebohm after the race regarding her strategy, "I knew that Kathleen Baker was going to take it out pretty hard, because that's her style.

"I know that people have seen me race the 200 backstroke like this many times before so for them, I think it was about trying to take it out hard because they think that will hurt me more in the back-end.

"But it is all about focusing on your own race and you don't get carried away with focusing on what people are doing around you, because at the end of the day, the perfect race plan for yourself works best and I stuck to what I know and what I'm good at and it worked out really well for me tonight."

Even 16-year-old Taylor McKeown, who earned a new World Junior Record in the event representing Australia, was in awe of her teammate's electric performance.

"I think I was more happy with her swim than mine to be honest, it was good to see her get out there and claim the world title again."

Australia Medal Table Through Day 7 (Rank / Nation / Gold / Silver / Bronze / Total) and Oceanic Records Through Day 7: see the original post linked below.

Read the original post:
Seebohm's Gold Symbolizes Personal Triumph Over Adversities - SwimSwam

Written by grays |

July 30th, 2017 at 2:32 pm

A vegan diet helps them win but are sports stars committed to the … – The Guardian

Posted: at 2:32 pm


At 3pm next Saturday the world's first and only vegan football club will make sporting history when they play in the Football League for the first time. Forest Green Rovers, who were founded in the 19th century by a man named Peach, and play in green at the appropriately named New Lawn, take on Barnet, the Bees, in their first fixture in League Two. They will use their new status to spread the message of veganism around the sporting world.

"We're having a big impact because we're counterintuitive," said Dale Vince, the multimillionaire who owns the club and the green energy company Ecotricity, and is a big donor to the Labour party. "Spreading the vegan word through the world of football: what could be more counterintuitive than that?"

Vince, who ensures that only plant-based food is available to players and spectators at the stadium in Nailsworth, Gloucestershire, is not alone. Famous athletes in a wide range of sports are forsaking meat and appear to be having a big impact on the number of people trying a plant-based diet.

Some of the world's leading footballers, including Barcelona's Lionel Messi and Manchester City's Sergio Agüero, do not eat meat during the playing season, while England striker Jermain Defoe has gone a big step further by taking up a vegan diet. Wimbledon finalist Venus Williams, an Italian rugby international, a US Olympic weightlifter, a number of hulking American football players, former heavyweight world champion David Haye, two snooker world champions and several top Australian cricketers are also on the vegan list.

Other sports with top-level vegans are wrestling, surfing, cycling, ice hockey, parkour (the extreme gymnastics sport that may be added to the Olympic programme), squash, bobsleigh, mixed martial arts fighting and ultra-running.

The surge had come in recent years, driven by digital media, said Vince. "It's a mixture of people looking for an edge to improve performance, greater prevalence of knowledge, and more scientific evidence about the benefits, not just in sports performance but for human health," he said.

While Vince and the Vegan Society welcome the new wave of plant-powered sports stars, others are less happy because many are giving up meat for personal performance benefits, not because they are in tune with the vegan lifestyle and compassion for animals.

But Dominika Piasecka, media officer for the Vegan Society, welcomed the sporting newcomers and predicted there would be more. "The sports stars' influence has definitely helped to further the cause of veganism," she said. "People really do take notice, especially if they're a fan of the person."

The England-based Vegan Society, founded in 1944 when the movement began, is overseeing the fastest-growing lifestyle movement of the 21st century. Its researchers put the number of vegans in Britain at 542,000, up 260% in 10 years, and estimate that about 1% of the population in Britain, Germany and the US is vegan. "It is very likely that we will see an increase in the number of vegan sportspeople because more of them are starting to realise the benefits of a vegan diet," Piasecka said. "Forest Green Rovers are breaking stereotypes and helping people to associate veganism with health, fitness and wellbeing."

Myths about the need to eat meat for protein have long since been disproved.

An aristocratic real tennis player who won a silver medal at the London 1908 Olympic Games was an early promoter of vegetarianism in Britain. Eustace Miles, a philanthropist, wrote Health Without Meat, which was a bestseller for years after it was published in 1915.

A few years later Paavo Nurmi, the Finn who was a vegetarian from boyhood, would establish himself as the greatest middle- and long-distance runner of the 20th century. He won nine Olympic golds.

The Australian swimmer Murray Rose, nicknamed "the seaweed streak" because he ate a lot of seaweed in his vegan diet, was 17 when he won three Olympic golds in 1956.

In the television age Ed Moses, unbeaten for eight years at the 400m hurdles, and 100m world record holder Leroy Burrell were vegetarian, and the biggest name by far to adopt a vegan diet was Carl Lewis, the world's most famous sprinter before the chicken-nugget lover Usain Bolt came along. The notorious boxer Mike Tyson also adopted a plant-based diet after he quit the ring.

But Lewis and Tyson stuck with it only for a few years and others may follow suit when their competing days are over.

Forest Green's vegan club label is not what it seems: the players are not vegan, they simply have to eat plant-based meals at matches and training. No animal products are on sale at the stadium. "We don't check up on them away from the club but we hear that players are changing their approach [to their diet], and it happens with fans too," said Vince. "Just doing these things and talking about it has an effect on everybody: players, fans, even the media."

Venus Williams is widely seen, even by herself, as a "cheagan", a cheating vegan who does not stick strictly to the lifestyle. A blogger on the ecorazzi website criticised Haye for using veganism as "nothing but a vessel for self-promotion and adulation. For the sake of the animals we should pay no attention, nor give any credence to the positions of these athletes who are interested in nothing but their own careers."

But Vince believes this criticism is misguided. "It's a mistake to be too purist about it. What we want to do is get the interest of the general public, we want them to know that a plant-based diet is easily available to them. It has to be accessible. Absolutely, these sportspeople can help."

Jason Gillespie, one of Australian cricket's great bowlers until he retired in 2006, became a vegan during his highly successful five-year stint as coach of Yorkshire. Gillespie, who buys clothes in charity shops to support recycling, caused a stir when he suggested that a way should be found of making cricket balls without using leather, and when he questioned Yorkshire's need to have a dairy company as a main sponsor. "Hopefully one day the dairy industry can be shut down," he said at the time.

Gillespie is far more committed than most of the big-name vegans in sport, but he agrees with Vince. "People who choose the vegan lifestyle on compassionate grounds will more likely stay with it as opposed to people who choose to eat vegan purely for performance or as a diet," Gillespie said from his home in Adelaide. "I have no problem with athletes eating vegan purely for performance. If the vegan message is getting out there then that is a step in the right direction.

"There is more awareness about veganism these days compared to when I was playing. I believe the vegan food industry is going to really expand in the next five years. It's very exciting."

Brenda Carey, owner and editor of Vegan Health and Fitness magazine in the US, is more wary than welcoming. "As an idealist, I wish that everyone would choose to eat vegan to relieve the suffering of animals. I also wish that everyone would care enough about the environment to want to avoid animal products as the number one cause of environmental destruction on the planet.

"There are many top vegan athletes who are vegan for these reasons and yes, they get a performance boost. However, there are many high-profile vegan athletes who never mention anything but their personal, selfish benefits from avoiding animal products and eating more plants. We call those people plant-based as they do not embrace a full vegan lifestyle, they just eat plants.

"I get concerned about the people who simply follow this behaviour because their favourite athlete or celebrity has done so. Unfortunately people who choose to go vegan, which often equates to merely eating plant-based, for selfish reasons such as performance benefits or weight loss, tend to be the ones who fall off the wagon and go back to eating animal products later."

Carey is, though, a realist as well as an idealist, so "I'm just glad when anyone chooses to avoid eating animal products."

The strongest sporting voice against uncommitted plant-eaters is Neil Robinson, who became the first vegan footballer in Britain and, he thinks, the world. He played for Everton, Swansea and others in a 16-year career from 1974, and says football is still in "the stone age" for vegans. He is critical of "cheagans" and fad dieters.

"To a small degree I get what Dale [Vince] thinks, but I am a purist. I'm always sceptical when I hear that a sportsperson or celebrity has become plant-based for health reasons. It dilutes veganism into being just a diet when in fact veganism is an ethos, a lifestyle of non-violence and compassion towards all living creatures.

"Football is still in the stone age when it comes to embracing veganism. Clubs these days employ qualified nutritionists and it baffles me as to how these so-called experts can ignore the fact that animal foods are totally unnecessary for human health and, most importantly as far as they're concerned, sporting performance. Individual sportsmen and women are now getting it but the clubs and authorities, with the exception of Forest Green Rovers, appear to be aeons away from embracing the plant food age."

Perhaps the best role model for plant-powered health is Fauja Singh, the 106-year-old runner who appeared alongside David Beckham in an advertising campaign and who ran the London marathon aged 101. He turned vegetarian in his 80s and eats mostly rotis, lentils, spinach and ginger. He takes only small portions and said: "If you ask me, more people die from overeating than starvation."

But when he wants a treat he goes to a place no vegan would ever visit: McDonald's, for a milkshake. A small one, of course.

Follow this link:
A vegan diet helps them win but are sports stars committed to the ... - The Guardian

Written by admin |

July 30th, 2017 at 2:32 pm

The Zing is intact – Ahmedabad Mirror

Posted: at 2:32 pm


For perhaps the first time in the opening match of an away Test series, several things went India's way. First, the coin flipped Virat Kohli's way and India had the chance to bat first on a flat track. Sri Lanka were then dealt a nasty blow when all-rounder Asela Gunaratne fractured a thumb on the first morning and could take no further part in the match. The strength of the home side was reduced again on the fourth day when spinner and stand-in captain Rangana Herath injured a finger on his delivery hand and didn't bowl for a major part of India's second innings.

Those incidents, aided of course by a near-immaculate performance from the visitors, saw Sri Lanka capsize to a rather meek 304-run loss inside four days. India had left Sri Lanka with a 550-run target after declaring their second innings closed at 240/3. In reply, the home side managed just 245, and that largely due to a brilliant personal performance by Dimuth Karunaratne (97) and his sixth-wicket partnership of 101 runs with Niroshan Dickwella (67). In the morning, Kohli helped himself to his 17th Test hundred and 10th as captain.

Probably the only real challenge that the Indian team faced thereafter was a dead track. But once again, as it did in the last domestic season, the bowling unit exerted constant pressure. The fast bowlers, Mohammed Shami and Umesh Yadav, secured early breakthroughs, and the spinners, R Ashwin and Ravindra Jadeja, demolished the lower order in quick time. The win comes after a rather messy coach selection process in which, if one is to believe the grapevine, all the players wanted Anil Kumble sacked and Ravi Shastri back in the saddle. In a footnote, Virender Sehwag was rejected by the skipper as the Delhi dasher wanted his own support staff.

Just another Test match

Kohli, however, indicated that that affair was no longer relevant. Asked if a win like this helped to calm any nerves, he answered: "I think it's another Test match for us. The nerves being built up or the panic being created was all on the outside. Inside the change room, the atmosphere is absolutely same. We have just done what we've done in the past two years on a consistent basis. And we've just gone out there and focused on the game.

"That's all you think of doing and that's all that we are going to do in the future." That begs two questions: if everything is the same as it was in the last two years, then why was Shastri removed in 2016, and why did he replace Kumble in 2017? The answers can wait though, so let's return to the Galle Test and the huge win. India began the day at 189-3 with Kohli batting on 76. He hit Danushka Gunathilaka for a six to reach 89 and looked in no hurry as he scored the remaining 11 runs he needed to reach the three-figure mark in singles and twos.

He got there off 133 balls, with five boundaries and a six studding his innings. A key part of the innings was that he took 42 singles, 16 twos and a three on a wicket where finding the boundaries was difficult. Sri Lanka's chase was a non-starter. They lost Upul Tharanga and Gunathilaka with just 29 on the board. The Karunaratne-Dickwella partnership only delayed their demise. India ticked all the right boxes: the openers scored, the middle order amassed runs, the fast bowlers picked up early wickets, and the spinners found ways to snare wickets on a flat track. As Kohli said, "Yeah, we would like to think so.

"There are still some areas that we want to look at and try to improve upon, especially when you get four-five wickets, not letting the lower order get away with a few runs in the latter half of the innings. It's something that we can still identify and work on in the next two games. But all in all, I think, playing Test cricket (February is when we last played Test cricket) and from then having played so much one-day cricket and not having the time to prepare so much in between, I think from that point of view it was good to get back into the groove and do the things that are supposed to be done to win a Test match." There is always room for improvement. But as far as this series goes, India can afford not to improve. It is the Lankans who will have to entirely transform themselves if they are to take the fight to this Indian team.

More here:
The Zing is intact - Ahmedabad Mirror

Written by admin |

July 30th, 2017 at 2:32 pm

Read your personal letters out loud at this performance – Mid-Day

Posted: at 2:32 pm


A theatre artiste invites an audience to read aloud their personal correspondence, and a love letter that Karl Marx wrote to his wife, in a new performance

A guest reads a letter at The Reading Room. Pics courtesy/Anuja Ghosalkar

Before the performance begins, Anuja Ghosalkar steps in front of a 10-member audience and declares that she is the 'postmistress' for the evening. She carries envelopes containing letters that guests brought with them to the venue. She distributes them at random, inviting each person to read them aloud. For an hour, the room is filled with different voices and diverse narratives as each member reads to a bunch of strangers in an intimate setting. This is the Bengaluru-based theatre artiste's new performance project, The Reading Room. Produced by her documentary theatre company, Drama Queen, it comes to Mumbai next week.

Letters curated for a performance

The idea of the performance emerged from her earlier production, Lady Anandi, where Ghosalkar used her family archive to tell the story of her great-grandfather and 19th-century theatre actor, Madhavrao. "It revealed a lot about my family history. I wanted to push that idea and invite people to share something from their lives. It gets more exciting when a stranger reads the words you may have written," says Ghosalkar, who blurs boundaries between the audience and performer with this project. "There is an absolute honesty when non-actors read." Since December, she has curated six performances in Bengaluru.

Anuja Ghosalkar

While there are no selection criteria for audience letters, they need to be in English and no more than four minutes long. She also invites the audience to read letters that she has sourced from books, online archives and friends, or borrowed from the public domain.

One of them is a love letter that Karl Marx wrote to his wife, which Ghosalkar received from her writer-director friend, Asmit Pathare. "The language in that is unlike any love letter one has read. There's also a letter from the National Centre for Biological Sciences, where a female scientist wrote about the role of women in science. It is short and succinct. The idea is to include diverse narratives. At the end of each performance, I request people to donate their letters to me, if they agree. Letters are a great archive of stories and histories," she sums up.

ON: August 1, 6 pm and 7.30 pm
AT: The Mumbai Assembly, KCA Hall, 16 Veronica Road, Bandra West
EMAIL: anu.ghosalkar@gmail.com
CALL: 9886741331
ENTRY: Rs 150 (by registration only)


Read the rest here:
Read your personal letters out loud at this performance - Mid-Day

Written by grays |

July 30th, 2017 at 2:32 pm

Simple, low-cost respiratory sensor measures and tracks personal metabolism – Tech Xplore

Posted: at 2:32 pm


The COBRA sensor is more portable and cost-effective than existing indirect calorimetry sensors. It includes a chest harness and bite grip that enable hands-free use of the system during exercise and training. Credit: MIT Lincoln Laboratory

The U.S. military has great interest in more comprehensive measurement and tracking of metabolism, both for optimizing the performance of warfighters under demanding physical conditions and for maintaining the health and wellness of forces during and after their military careers. While sensors for making metabolic measurements have existed for decades, they are expensive, cumbersome instruments primarily intended for clinical or professional use. MIT Lincoln Laboratory, in collaboration with the U.S. Army Research Institute for Environmental Medicine (USARIEM), has undertaken a research effort to create a low-cost personal metabolic sensor and an associated metabolic fuel model. The Carbon dioxide/Oxygen Breath and Respiration Analyzer (COBRA) enables individuals to make on-demand metabolic measurements simply by breathing into it.

"Besides assessing performance of soldiers in the field, the COBRA can be applied to broader purposes, such as training athletes for high-endurance activities, guiding weight loss by quantifying the impact of dietary and exercise regimens, or identifying nutritional imbalances," says Kyle Thompson, a member of the development team from Lincoln Laboratory's Mechanical Engineering Group.

Since the early 20th century, scientists have been using indirect calorimetry (IC) to calculate individual energy expenditure and metabolic rates. The method measures the ratio of carbon dioxide to oxygen in exhaled breath, which can be used to estimate how much carbohydrate and fat the body is burning to meet its metabolic energy needs. Information about energy expenditure rates is valuable for setting reasonable physical standards within the military. For example, limits on the distance and speed of foot marches can best be established by quantifying the metabolic workloads of soldiers. The Soldier 2020 program is currently employing metabolic energy measurement to help establish job-related fitness requirements.
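To make that arithmetic concrete, here is a minimal Python sketch of the conventional indirect-calorimetry calculation: the respiratory exchange ratio and the abbreviated Weir equation for energy expenditure. It is a generic textbook illustration, not the COBRA system's own processing pipeline; the function names and example values are assumptions made for this sketch.

```python
# Minimal indirect-calorimetry sketch (illustrative only; not COBRA's algorithm).

def respiratory_exchange_ratio(vco2_l_min: float, vo2_l_min: float) -> float:
    """Ratio of CO2 produced to O2 consumed.

    Values near 0.70 indicate predominantly fat oxidation; values near
    1.00 indicate predominantly carbohydrate oxidation.
    """
    return vco2_l_min / vo2_l_min


def energy_expenditure_kcal_min(vo2_l_min: float, vco2_l_min: float) -> float:
    """Abbreviated Weir equation (protein oxidation neglected):
    EE [kcal/min] = 3.941 * VO2 [L/min] + 1.106 * VCO2 [L/min]
    """
    return 3.941 * vo2_l_min + 1.106 * vco2_l_min


if __name__ == "__main__":
    vo2, vco2 = 2.0, 1.8  # example values for moderate exercise (assumed)
    print(f"RER: {respiratory_exchange_ratio(vco2, vo2):.2f}")
    print(f"Energy expenditure: {energy_expenditure_kcal_min(vo2, vco2):.1f} kcal/min")
```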

"For high-performance athletes or active-duty soldiers, optimally matching nutritional intake to the demands of a specific activity can improve performance and increase the likelihood of successful mission completion," says Gary Shaw, principal investigator on the laboratory's COBRA team. Physically demanding tasks can lead to glycogen depletion, which has a negative impact on performance. By tracking energy expenditure in real-time, soldiers could detect and avoid the onset of low glucose levels associated with glycogen depletion as well as other metabolic complications, such as heat stress.

While existing mobile IC sensors can make physiological measurements, they are expensive and complex to calibrate since their application has largely been limited to clinical studies, high-performance athletics, and field testing with small groups of subjects over limited periods of time. The COBRA sensor is smaller, simpler to use, and less costly to manufacture than existing IC sensors, enabling the measurement of individual energy expenditure for dozens of soldiers in a military field unit throughout the day. Lincoln Laboratory researchers hope to use such measurements to refine the personalized metabolic fuel model for individuals, track nutritional needs, and assess the impact of training on the individual's metabolic efficiency and endurance.
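As a rough idea of what a metabolic fuel model does with such measurements, the sketch below converts a respiratory exchange ratio into an approximate carbohydrate/fat split by linear interpolation between the classical endpoints (about 0.70 for pure fat oxidation, about 1.00 for pure carbohydrate oxidation). This is a deliberately crude stand-in for the laboratory's personalized model, which the article does not describe in detail; the function name and thresholds are assumptions.

```python
# Crude substrate-mix estimate from the respiratory exchange ratio (RER).
# Illustrative only; the personalized fuel model referenced in the article
# is presumably far more detailed.

def fuel_fractions_from_rer(rer: float) -> tuple[float, float]:
    """Return (carbohydrate_fraction, fat_fraction) of non-protein energy.

    Linearly interpolates between RER ~0.70 (essentially all fat) and
    RER ~1.00 (essentially all carbohydrate), clamping values that fall
    outside that physiological range.
    """
    carb = (rer - 0.70) / (1.00 - 0.70)
    carb = min(max(carb, 0.0), 1.0)  # clamp to [0, 1]
    return carb, 1.0 - carb


if __name__ == "__main__":
    for rer in (0.72, 0.85, 0.98):
        carb, fat = fuel_fractions_from_rer(rer)
        print(f"RER {rer:.2f}: ~{carb:.0%} carbohydrate, ~{fat:.0%} fat")
```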

"The COBRA system is a breakthrough technology that promises to provide performance comparable to $30,000-$40,000 sensors at a fraction of the cost and with ease of use that makes personal ownership feasible," Shaw says.

USARIEM is currently testing and evaluating the COBRA sensor by comparing the COBRA measurements against those collected by laboratory-grade instruments. Once the sensor performance has been benchmarked in the laboratory, USARIEM will conduct small field studies to measure energy expenditure and nutrient consumption associated with different training exercises. Following successful field measurements, low-rate production of the COBRA sensor may be pursued in order to study energy expenditure and performance across dozens of soldiers over days of activity.

Beyond its use in studies of the performance of soldiers and athletes, the COBRA sensor and associated metabolic model can be applied to the management of the general population's metabolic health. It is anticipated that the COBRA sensor and metabolic model can be used to tailor dietary and exercise regimens for managing weight, inferring blood glucose and glycogen storage levels, and creating public databases on metabolic wellness and trends. This information could be used by clinicians and patients to aid in controlling obesity, which affects over one-third of Americans, and to provide a non-invasive indication of chronically high blood glucose, which is associated with the development of type-2 diabetes. According to the Centers for Disease Control and Prevention, nearly half of the adult population in the United States is either diabetic or pre-diabetic.

There are several promising avenues for the COBRA sensor's future. The researchers have applied for a patent and plan to conduct single-subject experiments to demonstrate how the sensor can be used in assessing nutritional imbalances. The laboratory will also seek opportunities to collaborate with other researchers interested in using COBRA as a tool in clinical studies, including those concerned with weight loss and endurance training.

Continued here:
Simple, low-cost respiratory sensor measures and tracks personal metabolism - Tech Xplore

Written by admin |

July 30th, 2017 at 2:32 pm

Empowerment – Wikipedia

Posted: at 2:31 pm


The term empowerment refers to measures designed to increase the degree of autonomy and self-determination in people and in communities in order to enable them to represent their interests in a responsible and self-determined way, acting on their own authority. Empowerment as action refers both to the process of self-empowerment and to professional support of people, which enables them to overcome their sense of powerlessness and lack of influence, and to recognize and use their resources.

The term empowerment originates from American community psychology and is associated[by whom?] with the social scientist Julian Rappaport (1981).[1]

In social work, empowerment forms a practical approach of resource-oriented intervention. In the field of citizenship education and democratic education, empowerment is seen[by whom?] as a tool to increase the responsibility of the citizen. Empowerment is a key concept in the discourse on promoting civic engagement. Empowerment as a concept, which is characterized by a move away from a deficit-oriented towards a more strength-oriented perception, can increasingly be found in management concepts, as well as in the areas of continuing education and self-help.[citation needed]

Robert Adams points to the limitations of any single definition of 'empowerment', and the danger that academic or specialist definitions might take away the word and the connected practices from the very people they are supposed to belong to.[2] Still, he offers a minimal definition of the term: 'Empowerment: the capacity of individuals, groups and/or communities to take control of their circumstances, exercise power and achieve their own goals, and the process by which, individually and collectively, they are able to help themselves and others to maximize the quality of their lives.'[3]

One definition for the term is "an intentional, ongoing process centered in the local community, involving mutual respect, critical reflection, caring, and group participation, through which people lacking an equal share of resources gain greater access to and control over those resources".[4][5]

Rappaport's (1984) definition includes: "Empowerment is viewed as a process: the mechanism by which people, organizations, and communities gain mastery over their lives."[6]

Sociological empowerment often addresses members of groups that social discrimination processes have excluded from decision-making processes through for example discrimination based on disability, race, ethnicity, religion, or gender. Empowerment as a methodology is also associated with feminism.

Empowerment is the process of obtaining basic opportunities for marginalized people, either directly by those people, or through the help of non-marginalized others who share their own access to these opportunities. It also includes actively thwarting attempts to deny those opportunities. Empowerment also includes encouraging, and developing the skills for, self-sufficiency, with a focus on eliminating the future need for charity or welfare in the individuals of the group. This process can be difficult to start and to implement effectively.

One empowerment strategy is to assist marginalized people to create their own nonprofit organization, using the rationale that only the marginalized people themselves can know what their own people need most, and that control of the organization by outsiders can actually help to further entrench marginalization. Charitable organizations led from outside the community, for example, can disempower the community by entrenching a dependence on charity or welfare. A nonprofit organization can target strategies that cause structural changes, reducing the need for ongoing dependence. The Red Cross, for example, can focus on improving the health of indigenous people, but does not have authority in its charter to install water-delivery and purification systems, even though the lack of such systems profoundly, directly and negatively impacts health. A nonprofit composed of the indigenous people, however, could ensure that their own organization does have such authority and could set their own agendas, make their own plans, seek the needed resources, do as much of the work as they can, and take responsibility and credit for the success of their projects (or the consequences, should they fail).

Empowerment is the process which enables individuals and groups to fully access personal or collective power, authority and influence, and to employ that strength when engaging with other people, institutions or society. In other words, "Empowerment is not giving people power; people already have plenty of power, in the wealth of their knowledge and motivation, to do their jobs magnificently. We define empowerment as letting this power out."[7] It encourages people to gain the skills and knowledge that will allow them to overcome obstacles in life or in the work environment and, ultimately, to develop within themselves and in society.

To empower a female "...sounds as though we are dismissing or ignoring males, but the truth is, both genders desperately need to be equally empowered."[8] Empowerment occurs through improvement of conditions, standards, events, and a global perspective of life.

Before there can be a finding that a particular group requires empowerment, and that therefore its self-esteem needs to be consolidated on the basis of awareness of its strengths, there needs to be a deficit diagnosis, usually carried out by experts assessing the problems of this group. The fundamental asymmetry of the relationship between experts and clients is usually not questioned by empowerment processes. It also needs to be regarded critically how far the empowerment approach is really applicable to all patients or clients. It is particularly questionable whether mentally ill people in acute crisis situations are in a position to make their own decisions. According to Albert Lenz, people behave primarily regressively in acute crisis situations and tend to leave responsibility to professionals.[9] It must be assumed, therefore, that the implementation of the empowerment concept requires a minimum level of communication and reflectivity of the persons involved.

In social work, empowerment offers an approach that allows social workers to increase the capacity for self-help of their clients. For example, this allows clients not to be seen as passive, helpless 'victims' to be rescued but instead as self-empowered people fighting abuse or oppression, a fight in which the social worker takes the position of a facilitator instead of that of a 'rescuer'.[10]

Marginalized people who lack self-sufficiency become, at a minimum, dependent on charity, or welfare. They lose their self-confidence because they cannot be fully self-supporting. The opportunities denied them also deprive them of the pride of accomplishment which others, who have those opportunities, can develop for themselves. This in turn can lead to psychological, social and even mental health problems. "Marginalized" here refers to the overt or covert trends within societies whereby those perceived as lacking desirable traits or deviating from the group norms tend to be excluded by wider society and ostracized as undesirables.

According to Robert Adams, there is a long tradition in the UK and the USA respectively to advance forms of self-help that have developed and contributed to more recent concepts of empowerment. For example, the free enterprise economic theories of Milton Friedman embraced self-help as a respectable contributor to the economy. Both the Republicans in the US and the Conservative government of Margaret Thatcher built on these theories. 'At the same time, the mutual aid aspects of the concept of self-help retained some currency with socialists and democrats.'[11]

In economic development, the empowerment approach focuses on mobilizing the self-help efforts of the poor, rather than providing them with social welfare. Economic empowerment is also the empowering of previously disadvantaged sections of the population, for example, in many previously colonized African countries.[12]

Legal empowerment happens when marginalised people or groups use legal mobilisation, i.e., law, legal systems and justice mechanisms, to improve or transform their social, political or economic situations. Legal empowerment approaches are interested in understanding how the marginalised can use the law to advance their interests and priorities.[13]

According to the Open Society Foundations (an NGO), "Legal empowerment is about strengthening the capacity of all people to exercise their rights, either as individuals or as members of a community. Legal empowerment is about grassroots justice, about ensuring that law is not confined to books or courtrooms, but rather is available and meaningful to ordinary people."[14]

Lorenzo Cotula, in his book Legal Empowerment for Local Resource Control, points out that the fact that legal tools for securing local resource rights are enshrined in the legal system does not necessarily mean that local resource users are in a position to use them and benefit from them. The state legal system is constrained by a range of factors, from lack of resources to cultural issues. Among these factors, economic, geographic, linguistic and other constraints on access to courts, lack of legal awareness and lack of legal assistance tend to be recurrent problems.[15]

In many contexts, marginalised groups do not trust the legal system owing to the widespread manipulation to which it has historically been subjected by the more powerful. The extent to which people know the law and can make it work for themselves with 'paralegal tools' is legal empowerment; it is assisted by innovative approaches such as legal literacy and awareness training, broadcasting of legal information, participatory legal discourses, support for local resource users in negotiating with other agencies and stakeholders, and strategies combining the use of legal processes with advocacy, media engagement and socio-legal mobilisation.[15]

Sometimes groups are marginalized by society at large, with governments participating in the process of marginalization. Equal opportunity laws, which actively oppose such marginalization, are supposed to allow empowerment to occur. These laws made it illegal to restrict access to schools and public places based on race. They can also be seen as a symptom of minorities' and women's empowerment through lobbying.

Gender empowerment conventionally refers to the empowerment of women, a significant topic of discussion with regard to development and economics today. It also points to approaches regarding other marginalized genders in a particular political or social context. This approach to empowerment is partly informed by feminism and employs legal empowerment by building on international human rights. Empowerment is one of the main procedural concerns when addressing human rights and development. The Human Development and Capabilities Approach, the Millennium Development Goals, and other credible approaches point to empowerment and participation as necessary steps if a country is to overcome the obstacles associated with poverty and development.[16] The UN Sustainable Development Goals target gender equality and women's empowerment as part of the global development agenda.[17]

According to Thomas A. Potterfield,[18] many organizational theorists and practitioners regard employee empowerment as one of the most important and popular management concepts of our time.

Ciulla discusses an inverse case: that of bogus empowerment.[19]

In the sphere of management and organizational theory, "empowerment" often refers loosely to processes for giving subordinates (or workers generally) greater discretion and resources: distributing control in order to better serve both customers and the interests of employing organizations.

One account of the history of workplace empowerment in the United States recalls the clash of management styles in railroad construction in the American West in the mid-19th century, where "traditional" hierarchical East-Coast models of control encountered individualistic pioneer workers, strongly supplemented by methods of efficiency-oriented "worker responsibility" brought to the scene by Chinese laborers. In this case, empowerment at the level of work teams or brigades achieved a notable (but short-lived) demonstrated superiority. See the views of Robert L. Webb.

During the 1980s and 1990s, empowerment became a point of interest in management concepts and business administration. In this context, empowerment involves approaches that promise employees greater participation and integration so that they can cope with their tasks as independently and responsibly as possible. A strength-based approach known as the "empowerment circle" has become an instrument of organizational development. Multidisciplinary empowerment teams aim to develop quality circles to improve the organizational culture, strengthening the motivation and skills of employees. The goal of subjective job satisfaction of employees is pursued through flat hierarchies, participation in decisions, openness to creative effort, a positive and appreciative team culture, self-evaluation, taking responsibility (for results), more self-determination and constant further learning. The optimal use of existing potential and abilities can supposedly be better achieved by satisfied and active workers. Here, knowledge management contributes significantly to implementing employee participation as a guiding principle, for example through the creation of communities of practice.[20]

However, it is important to ensure that the individual employee has the skills to meet their allocated responsibilities and that the company's structure sets up the right incentives, rewarding employees for taking on responsibility. Otherwise there is a danger that employees become overwhelmed or even lethargic.[21]

Empowerment of employees requires a culture of trust in the organization and an appropriate information and communication system. The aim of these activities is to save on control costs, which become redundant when employees act independently and in a self-motivated fashion. In the book Empowerment Takes More Than a Minute, the authors illustrate three keys that organizations can use to open up the knowledge, experience, and motivation power that people already have.[7] The three keys that managers must use to empower their employees are sharing information with everyone, creating autonomy through boundaries, and replacing the old hierarchy with self-directed teams.

According to Stewart, in order to guarantee a successful work environment, managers need to exercise the "right kind of authority" (p. 6). To summarize, "empowerment is simply the effective use of a manager's authority", and, subsequently, it is a productive way to maximize all-around work efficiency.[22]

These keys are hard to put into place, and achieving empowerment in a workplace is a journey. It is important to train employees and make sure they have trust in what empowerment will bring to a company.[7]

The implementation of the concept of empowerment in management has also been criticised for failing to live up to its claims.[23]

Read more:
Empowerment - Wikipedia

Written by grays |

July 30th, 2017 at 2:31 pm




