Eliezer Yudkowsky Quotes
Eliezer Yudkowsky
American Writer
Born: Sep 11, 1979
Topics: Intelligence, People, Science, Think, World, You
Related authors: Dale Carnegie, Denis Waitley, Dr. Seuss, H. L. Mencken, Napoleon Hill, Ray Bradbury, W. E. B. Du Bois, William Arthur Ward
By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.
Eliezer Yudkowsky
You cannot 'rationalize' what is not rational to begin with - as if lying were called 'truthization.' There is no way to obtain more truth for a proposition by bribery, flattery, or the most passionate argument - you can make more people believe the proposition, but you cannot make it more true.
Eliezer Yudkowsky
A burning itch to know is higher than a solemn vow to pursue truth. To feel the burning itch of curiosity requires both that you be ignorant, and that you desire to relinquish your ignorance.
Eliezer Yudkowsky
The human species was not born into a market economy. Bees won't sell you honey if you offer them an electronic funds transfer. The human species imagined money into existence, and it exists - for us, not mice or wasps - because we go on believing in it.
Eliezer Yudkowsky
Nothing you'll read as breaking news will ever hold a candle to the sheer beauty of settled science. Textbook science has carefully phrased explanations for new students, math derived step by step, plenty of experiments as illustration, and test problems.
Eliezer Yudkowsky
Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud. Submit yourself to ordeals and test yourself in fire. Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts.
Eliezer Yudkowsky
Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can.
Eliezer Yudkowsky
Textbook science is beautiful! Textbook science is comprehensible, unlike mere fascinating words that can never be truly beautiful. Elementary science textbooks describe simple theories, and simplicity is the core of scientific beauty. Fascinating words have no power, nor yet any meaning, without the math.
Eliezer Yudkowsky
In our skulls, we carry around 3 pounds of slimy, wet, greyish tissue, corrugated like crumpled toilet paper. You wouldn't think, to look at the unappetizing lump, that it was some of the most powerful stuff in the known universe.
Eliezer Yudkowsky
My successes already accomplished have mostly been taking existing science and getting people to apply it in their everyday lives.
Eliezer Yudkowsky
To be clever in argument is not rationality but rationalization.
Eliezer Yudkowsky
The systematic experimental study of reproducible errors of human reasoning, and what these errors reveal about underlying mental processes, is known as the heuristics and biases program in cognitive psychology. This program has made discoveries highly relevant to assessors of global catastrophic risks.
Eliezer Yudkowsky
The obvious choice isn't always the best choice, but sometimes, by golly, it is. I don't stop looking as soon as I find an obvious answer, but if I go on looking, and the obvious-seeming answer still seems obvious, I don't feel guilty about keeping it.
Eliezer Yudkowsky
When something is universal enough in our everyday lives, we take it for granted to the point of forgetting it exists.
Eliezer Yudkowsky
Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.
Eliezer Yudkowsky
I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'Rain Man'; it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.
Eliezer Yudkowsky
Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.
Eliezer Yudkowsky
Transhumanists are not fond of death. We would stop it if we could. To this end, we support research that holds out hope of a future in which humanity has defeated death.
Eliezer Yudkowsky
If I could create a world where people lived forever, or at the very least a few billion years, I would do so. I don't think humanity will always be stuck in the awkward stage we now occupy, when we are smart enough to create enormous problems for ourselves, but not quite smart enough to solve them.
Eliezer Yudkowsky
If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.
Eliezer Yudkowsky