Statistics

What is the specialty of Statistics:

Statistics is the discipline concerned with the collection, organization, analysis, interpretation, and presentation of data. When statistics are applied to a scientific, industrial, or social problem, it is customary to begin with a statistical population or a statistical model to be studied. Populations can be diverse groups of people or objects, such as "all people living in a country" or "every atom composing a crystal." Statistics deals with every aspect of data, including the planning of data collection in terms of the design of surveys and experiments.

When census data cannot be collected, statisticians collect data by developing specific experiment designs and survey samples. Representative sampling ensures that inferences and conclusions can reasonably extend from the sample to the population as a whole. An experimental study involves taking measurements of the system under study, manipulating the system, and then taking additional measurements using the same procedure to determine whether the manipulation has modified the values of the measurements. In contrast, an observational study does not involve experimental manipulation.
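As a concrete sketch of why representative sampling works, the following Python snippet draws a simple random sample from a hypothetical numeric population (the values and sizes here are invented for illustration) and compares the sample mean with the population mean:

    # A minimal sketch of simple random sampling, assuming a hypothetical
    # population of measurements (e.g., heights in cm). With random selection,
    # the sample mean should land close to the population mean.
    import random
    import statistics

    random.seed(42)
    population = [random.gauss(170, 10) for _ in range(100_000)]

    sample = random.sample(population, k=500)  # each member equally likely

    print(f"population mean: {statistics.mean(population):.2f}")
    print(f"sample mean:     {statistics.mean(sample):.2f}")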

Two main statistical methods are used in data analysis: descriptive statistics, which summarize data from a sample using indexes such as the mean or standard deviation, and inferential statistics, which draw conclusions from data that are subject to random variation (e.g., observational errors, sampling variation). Descriptive statistics are most often concerned with two sets of properties of a distribution (of a sample or a population): central tendency (or location) seeks to characterize the distribution's central or typical value, while dispersion (or variability) characterizes the extent to which members of the distribution depart from its center and from one another. Inferences in mathematical statistics are made within the framework of probability theory, which deals with the analysis of random phenomena.
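As a concrete illustration of the two descriptive measures named above, here is a minimal Python sketch using only the standard library; the sample values are hypothetical:

    # Central tendency (mean) and dispersion (standard deviation) of a sample.
    import statistics

    sample = [2.1, 2.5, 2.2, 3.0, 2.8, 2.4]  # hypothetical measurements

    mean = statistics.mean(sample)    # typical value of the distribution
    stdev = statistics.stdev(sample)  # spread around that value (n - 1 denominator)

    print(f"mean = {mean:.3f}, standard deviation = {stdev:.3f}")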

A standard statistical procedure involves collecting data that lead to a test of the relationship between two statistical data sets, or between a data set and synthetic data drawn from an idealized model. A hypothesis is proposed for the statistical relationship between the two data sets, and this is compared as an alternative to an idealized null hypothesis of no relationship between the two data sets. The null hypothesis is rejected or refuted using statistical tests that quantify the sense in which the null can be proven false, given the data used in the test. Working from a null hypothesis, two basic forms of error are recognized: Type I errors (the null hypothesis is falsely rejected, giving a "false positive") and Type II errors (the null hypothesis fails to be rejected and an actual relationship between populations is missed, giving a "false negative"). Multiple problems have come to be associated with this framework, ranging from obtaining a sufficient sample size to specifying an adequate null hypothesis.
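The simulation below makes the two error types concrete by repeating a two-sample t-test many times, first with the null hypothesis true and then with it false. It is only a sketch: it assumes NumPy and SciPy are available, and the significance level, sample size, and effect size are illustrative choices, not values from the text:

    # Estimating Type I and Type II error rates by simulation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    alpha, n, trials = 0.05, 30, 2000

    type_i = type_ii = 0
    for _ in range(trials):
        # Null hypothesis true: both samples share the same mean.
        a, b = rng.normal(0, 1, n), rng.normal(0, 1, n)
        if stats.ttest_ind(a, b).pvalue < alpha:
            type_i += 1  # false positive

        # Null hypothesis false: the means really differ by 0.5.
        c, d = rng.normal(0, 1, n), rng.normal(0.5, 1, n)
        if stats.ttest_ind(c, d).pvalue >= alpha:
            type_ii += 1  # false negative

    print(f"Type I rate  ~ {type_i / trials:.3f} (near alpha = {alpha})")
    print(f"Type II rate ~ {type_ii / trials:.3f} (depends on effect size and n)")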

Measurement processes that generate statistical data are also subject to error. Many of these errors are classified as random (noise) or systematic (bias), but other types of error (e.g., blunders, such as when an analyst reports incorrect units) can also occur. The presence of missing data or censoring may result in biased estimates, and specific techniques have been developed to address these problems.
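A short simulation makes the distinction concrete: random error averages out over repeated measurements, while systematic error (bias) persists no matter how many measurements are taken. The true value, noise level, and bias below are invented for illustration:

    # Random error (zero-mean noise) vs. systematic error (constant bias).
    import random
    import statistics

    random.seed(1)
    true_value = 100.0
    noisy  = [true_value + random.gauss(0, 2) for _ in range(10_000)]
    biased = [true_value + 1.5 + random.gauss(0, 2) for _ in range(10_000)]

    print(f"noisy mean:  {statistics.mean(noisy):.2f}   (close to {true_value})")
    print(f"biased mean: {statistics.mean(biased):.2f}  (offset persists)")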

History of Statistics:

Early writings on statistical inference date back to Arab mathematicians and cryptographers during the Islamic Golden Age between the 8th and 13th centuries. Al-Khalil (717-786) wrote the Book of Cryptographic Messages, which contains the first use of permutations and combinations, to list all possible Arabic words with and without vowels. Al-Kindi's Manuscript on Deciphering Cryptographic Messages gave a detailed description of how to use frequency analysis to decipher encrypted messages. Al-Kindi also made the earliest known use of statistical inference, and he and other Arab cryptographers developed the early statistical methods for decoding encrypted messages. Ibn Adlan (1187-1268) later made an important contribution on the use of sample size in frequency analysis.

The earliest European writing on statistics dates back to 1663, with the publication of Natural and Political Observations upon the Bills of Mortality by John Graunt. Early applications of statistical thinking revolved around the need of states to base policy on demographic and economic data, hence the discipline's derivation from the word "state." The scope of statistics broadened in the early 19th century to include the collection and analysis of data in general. Today, statistics is widely employed in government, business, and the natural and social sciences. The mathematical foundations of modern statistics were laid in the 17th century with the development of probability theory by Gerolamo Cardano, Blaise Pascal, and Pierre de Fermat. Mathematical probability theory arose from the study of games of chance, although the concept of probability had already been examined in medieval law and by philosophers such as Juan Caramuel. Adrien-Marie Legendre first described the method of least squares in 1805.

Karl Pearson, a founder of mathematical statistics.

The modern field of statistics emerged in the late 19th and early 20th centuries in three stages. The first wave, at the turn of the century, was led by the work of Francis Galton and Karl Pearson, who transformed statistics into a rigorous mathematical discipline used for analysis, not only in science but also in industry and politics. Galton's contributions included introducing the concepts of standard deviation, correlation, and regression analysis, and applying these methods to the study of a variety of human characteristics (height, weight, and eyelash length, among others). Pearson's contributions included the Pearson product-moment correlation coefficient, defined as a product-moment, the method of moments for fitting distributions to samples, and the Pearson distribution, among many other things. Galton and Pearson founded Biometrika as the first journal of mathematical statistics and biostatistics (then called biometry), and the latter founded the world's first university statistics department at University College London.

Ronald Fisher coined the term "null hypothesis" during the Lady tasting tea experiment, a hypothesis that "is never proved or established, but is possibly disproved, in the course of experimentation."

William Sealy Gosset began the second wave of the 1910s and 1920s, and it culminated in the insights of Ronald Fisher, who wrote the textbooks that were to define the academic discipline in universities around the world. Fisher's most important publications were his seminal 1918 paper The Correlation between Relatives on the Supposition of Mendelian Inheritance (which was the first to use the statistical term variance), his classic 1925 work Statistical Methods for Research Workers, and his 1935 The Design of Experiments, where he developed a rigorous methodology for the design of experimental models. He originated the concepts of sufficiency, ancillary statistics, Fisher's linear discriminant, and Fisher information. In his 1930 book The Genetical Theory of Natural Selection, he applied statistics to various biological concepts such as Fisher's principle (which A. W. F. Edwards called "probably the most celebrated argument in evolutionary biology") and Fisherian runaway, a concept in sexual selection about a positive feedback runaway effect found in evolution.

The final wave, which mainly saw the refinement and expansion of earlier developments, emerged from the collaborative work of Egon Pearson and Jerzy Neyman in the 1930s. They introduced the concepts of Type II error, the power of a test, and confidence intervals. In 1934, Jerzy Neyman showed that stratified random sampling was in general a better method of estimation than purposive (quota) sampling. The use of modern computers has expedited large-scale statistical computations and has also made possible new methods that would be impractical to perform manually.

The importance of studying the specialty of Statistics:

The field of statistics is the science of learning from data. Statistical knowledge helps you use the proper methods to collect data, employ the correct analyses, and present the results effectively. Statistics is a crucial process behind how we make discoveries in science, make decisions based on data, and make predictions. Statistics allows you to understand a subject much more deeply.

There are two main reasons why the study of statistics is so important in modern society. First, statisticians are guides for learning from data and navigating common problems that can lead you to incorrect conclusions. Second, given the growing importance of decisions and opinions based on data, it is important that you can critically assess the quality of the analyses that others present to you.

Statistics courses:

  • Statistical modeling and sampling
  • Modern algebra
  • Mathematical equations
  • Economic statistics
  • Economic planning
  • Population census
  • Statistical systems and models
  • Econometrics
  • Numerical analysis
  • Agricultural statistics
  • Financial economics
  • Statistical analysis software and methods
  • Economics of globalization

Fields of work for the Statistics major:

  • Banks
  • Statistical Analysis Companies and Offices
  • Insurance Companies
  • Computing Companies
  • Financial Institutions
  • Freelance
  • Stock Exchange
  • Economist