Definition of positivism
(noun) the form of empiricism that bases all knowledge on perceptual experience (not on intuition or revelation)
(noun) a quality or state characterized by certainty, acceptance, or affirmation, and by dogmatic assertiveness