Academic year: 2022

(1)

Factor Analysis

PhD Course

(2)

• Factor analysis is used to draw inferences on unobservable quantities such as intelligence, musical ability, patriotism, consumer attitudes, that cannot be measured directly.

• The goal of factor analysis is to describe correlations between p measured traits in terms of variation in few underlying and unobservable factors.

• Changes across subjects in the values of one or more unobserved factors can affect an entire subset of measured traits and cause them to be highly correlated.

Factor Analysis - Introduction

(3)

• A good factor analysis can be expected only if there is strong correlation between the variables involved in the study. In that case there is a chance that the common information space can be spanned by a small number of uncorrelated variables (these will be the common factors).

• Variables that are poorly correlated with the other variables should be omitted from the factor analysis.

Factor Analysis - Introduction

(4)
(5)

Factor Analysis - Example

• A marketing firm wishes to determine how consumers choose to patronize certain stores.

• Customers at various stores were asked to complete a survey with about p = 80 questions.

• Marketing researchers postulate that consumer choices are based on a few underlying factors such as: friendliness of personnel, level of customer service, store atmosphere, product assortment, product quality and general price level.

• A factor analysis would use correlations among responses to the 80 questions to determine if they can be grouped into six sub-groups that reflect variation in the six postulated factors.
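The grouping idea above can be sketched on synthetic data. The following is a minimal illustration, not the firm's actual survey: the respondent count, the 12 questions, and the 3-factor structure are invented assumptions, and scikit-learn's FactorAnalysis stands in for whatever software the researchers would use.

```python
# Sketch of the survey scenario with synthetic data; the sizes and the
# factor structure are illustrative assumptions only.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n, p, k = 500, 12, 3            # 500 respondents, 12 questions, 3 latent factors

# Latent factors and a sparse loading matrix: each question loads
# mainly on one underlying factor.
F = rng.normal(size=(n, k))
L = np.zeros((p, k))
for j in range(p):
    L[j, j % k] = 0.9
X = F @ L.T + 0.3 * rng.normal(size=(n, p))   # observed answers

fa = FactorAnalysis(n_components=k, random_state=0).fit(X)
loadings = fa.components_.T                    # p x k estimated loading matrix
print(loadings.shape)                          # (12, 3)
```

Each row of the estimated loading matrix tells which latent factor dominates that question, which is exactly the grouping the marketing researchers postulate.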

(6)

Factor Analysis

The k-Factor Model

The vector of the observed variables is $X = (X_1, X_2, \dots, X_p)^T$.

Suppose that

$$X - \mu = \Lambda f + U$$

(7)

Factor Analysis

where $\Lambda = (\lambda_{ij})$ is the $p \times k$ matrix of factor loadings, $f = (f_1, \dots, f_k)^T$ is the vector of the common factors, and $U = (u_1, \dots, u_p)^T$ is the vector of the unique factors.

(8)

Factor Analysis

Calculating the expectations of the two sides gives $E(X) = \mu$, since the model assumes

$$E(f) = 0, \quad \mathrm{Cov}(f) = I_k \quad \text{(common factors)}$$

$$E(U) = 0, \quad \mathrm{Cov}(U) = \Psi \text{ diagonal}, \quad \mathrm{Cov}(f, U) = 0 \quad \text{(unique factors)}$$

Taking the variances of the two sides,

$$\mathrm{Var}(X_i) = \sum_{j=1}^{k} \lambda_{ij}^2 + \psi_i = h_i^2 + \psi_i.$$

The portion $h_i^2 = \sum_{j=1}^{k} \lambda_{ij}^2$ of the variance that is contributed by the $k$ common factors is the communality, and the portion $\psi_i$ that is not explained by the common factors is called the uniqueness (or the specific variance).

(13)

The covariance structure of the model:

$$\Sigma = \mathrm{Cov}(X) = \Lambda \Lambda^T + \Psi, \qquad \mathrm{Cov}(X_i, X_j) = \sum_{l=1}^{k} \lambda_{il} \lambda_{jl} \quad (i \neq j)$$
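The decomposition of the covariance matrix can be checked numerically. Below is a minimal sketch with a small hypothetical loading matrix (the numbers are invented for illustration, not taken from the lecture's data):

```python
# Numeric check of Sigma = Lambda Lambda^T + Psi for a hypothetical
# model with p = 4 variables and k = 2 common factors.
import numpy as np

Lam = np.array([[0.9, 0.1],
                [0.8, 0.3],
                [0.2, 0.7],
                [0.1, 0.8]])               # p x k loading matrix
psi = np.array([0.18, 0.27, 0.45, 0.35])   # specific variances (diagonal of Psi)

Sigma = Lam @ Lam.T + np.diag(psi)         # implied covariance matrix

communality = (Lam ** 2).sum(axis=1)       # h_i^2 = sum_j lambda_ij^2
# each diagonal element of Sigma splits into communality + uniqueness
print(np.allclose(np.diag(Sigma), communality + psi))   # True
```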

The process of the factor analysis

[Diagram: the factor analysis pipeline. The data matrix (observations $O_1, \dots, O_n$ on variables $v_1, \dots, v_k$) yields the covariance matrix (variables $v_1, \dots, v_k$ by $v_1, \dots, v_k$), from which the loading matrix (variables $v_1, \dots, v_k$ by factors $F_1, \dots, F_j$) is extracted.]

(20)
(21)
(22)

Illustrating rotation in a simple two-dimensional example:

[Plot of the factor loadings of the variables on two factors, axes from -1 to +1, with two variable groups a and b.]

Without rotation, the original variables belonging to groups a and b have significant factor loadings on both factors.

(23)

Illustrating rotation in a simple two-dimensional example:

[The same loading plot after rotation.]

After rotation, the variables lying in group 'a' have loadings near zero on the first factor, and the variables lying in group 'b' have loadings near zero on the other factor.

(24)

The rotations which make the model more meaningful:

Varimax — An orthogonal rotation method that minimizes the number of variables that have high loadings on each factor. This method simplifies the interpretation of the factors.

Quartimax — A rotation method that minimizes the number of factors needed to explain each variable. This method simplifies the interpretation of the observed variables.

Equamax — A mixed rotation method that combines the varimax method, which simplifies the factors, with the quartimax method, which simplifies the variables. Both the number of variables that load highly on a factor and the number of factors needed to explain a variable are minimized.

Direct Oblimin — A method for oblique (nonorthogonal) rotation. When delta equals 0 (the default), solutions are most oblique. As delta becomes more negative, the factors become less oblique. To override the default delta of 0, enter a number less than or equal to 0.8.

Promax — An oblique rotation, which allows factors to be correlated. It can be computed more quickly than a direct oblimin rotation, so it is useful for large datasets.

(25)

VARIMAX rotation

(26)

VARIMAX rotation

The varimax criterion chooses the orthogonal rotation that maximizes

$$V = \sum_{j=1}^{k} \left[ \frac{1}{p} \sum_{i=1}^{p} \left( \frac{\lambda^*_{ij}}{h_i} \right)^4 - \left( \frac{1}{p} \sum_{i=1}^{p} \left( \frac{\lambda^*_{ij}}{h_i} \right)^2 \right)^2 \right]$$

where $\Lambda^* = (\lambda^*_{ij})$ is the resulting (rotated) loading matrix and $h_i$ is the ith communality.

(27)


(28)
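In practice the varimax criterion is optimized iteratively. The following is a minimal sketch of the standard SVD-based algorithm (after Kaiser); it is an illustrative implementation, not the routine used in the lecture's software:

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    """Rotate the p x k loading matrix L by the varimax criterion.

    Standard iterative SVD-based algorithm; returns the rotated
    loadings and the orthogonal rotation matrix R.
    """
    p, k = L.shape
    R = np.eye(k)
    d_old = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr**3 - (gamma / p) * Lr @ np.diag((Lr**2).sum(axis=0))))
        R = u @ vt
        d = s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
        d_old = d
    return L @ R, R

# Toy loading matrix: both factors load on all variables before rotation.
L = np.array([[0.7, 0.6], [0.8, 0.5], [0.6, -0.7], [0.5, -0.8]])
Lrot, R = varimax(L)
# The rotation is orthogonal, so the communalities are unchanged.
print(np.allclose(R @ R.T, np.eye(2)))                # True
print(np.allclose((Lrot**2).sum(1), (L**2).sum(1)))   # True
```

Because R is orthogonal, the communalities (row sums of squared loadings) are preserved, as the slides note; only the distribution of the loadings across the factors changes.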

Kaiser-Meyer-Olkin (KMO) Test for Sampling Adequacy

There is only a chance for a good factor analysis if there is a strong correlation between the variables included in the study. The relationship is tested by the KMO statistic:

$$KMO = \frac{\sum_{i \neq j} r_{ij}^2}{\sum_{i \neq j} r_{ij}^2 + \sum_{i \neq j} q_{ij}^2}$$

where $q_{ij}$ are the partial correlation coefficients and $r_{ij}$ are the Pearson correlation coefficients.

(29)

Kaiser-Meyer-Olkin (KMO) Test for Sampling Adequacy

For reference, Kaiser put the following values on the results:

• 0.00 to 0.49 unacceptable.

• 0.50 to 0.59 miserable.

• 0.60 to 0.69 mediocre.

• 0.70 to 0.79 middling.

• 0.80 to 0.89 meritorious.

• 0.90 to 1.00 marvelous.

Kaiser-Meyer-Olkin (KMO) Test is a measure of how suited your data is for Factor Analysis. The test measures sampling adequacy for each variable in the model and for the complete model.

(30)

Measure of Sampling Adequacy

The per-variable analogue of the KMO statistic is

$$MSA_i = \frac{\sum_{j \neq i} r_{ij}^2}{\sum_{j \neq i} r_{ij}^2 + \sum_{j \neq i} q_{ij}^2}.$$

Of the starting p variables, those having a small $MSA_i$ value should be left out. If the KMO statistic is not large enough, the variables with small MSA values are omitted; with the remaining variables, the value of the KMO statistic increases.
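Both the KMO statistic and the per-variable MSA values can be computed directly from the correlation matrix, obtaining the partial correlations from its inverse. A minimal sketch on synthetic data (the data and the helper name `kmo` are illustrative assumptions):

```python
import numpy as np

def kmo(X):
    """KMO statistic and per-variable MSA values, from the Pearson
    correlations r_ij and the partial correlations q_ij (the latter
    computed from the inverse of the correlation matrix).
    """
    R = np.corrcoef(X, rowvar=False)
    P = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(P), np.diag(P)))
    Q = -P / d                          # partial correlation matrix
    np.fill_diagonal(Q, 0.0)
    R0 = R.copy()
    np.fill_diagonal(R0, 0.0)
    r2, q2 = R0**2, Q**2
    kmo_total = r2.sum() / (r2.sum() + q2.sum())
    msa = r2.sum(axis=0) / (r2.sum(axis=0) + q2.sum(axis=0))
    return kmo_total, msa

rng = np.random.default_rng(1)
# Three noisy copies of a common signal plus one pure-noise variable.
z = rng.normal(size=(300, 1))
X = np.hstack([z + 0.3 * rng.normal(size=(300, 3)),
               rng.normal(size=(300, 1))])
kmo_total, msa = kmo(X)
print(kmo_total > 0.5)    # True: the correlated block gives an acceptable KMO
print(msa.shape)          # (4,)
```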

(31)

Bartlett's Test of Sphericity

Before running the factor analysis, one should ensure that the variables are sufficiently correlated; strong correlation (multicollinearity) is undesirable in regression analysis, but here it is a prerequisite. Bartlett's measure tests the null hypothesis that the original correlation matrix is an identity matrix:

H0: the correlation matrix = E (identity matrix)
H1: the correlation matrix ≠ E (identity matrix)

The identity matrix E has ones on the diagonal and zeros off the diagonal, meaning that the original data has no correlations among its variables.

Factor analysis cannot be performed on data whose correlation matrix is the identity matrix. Therefore, we want this test to be significant (i.e. to have a significance value less than 0.05). If the p-value is less than 0.05, we reject the null hypothesis; thus there are relationships between the variables considered in the analysis.
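The test statistic follows a chi-squared distribution with $p(p-1)/2$ degrees of freedom under H0. A minimal sketch of the computation on synthetic correlated data (the data are invented for illustration):

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(X):
    """Bartlett's test of sphericity: H0 says the correlation matrix
    is the identity (no correlations among the variables).
    """
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    p_value = stats.chi2.sf(chi2, df)
    return chi2, p_value

rng = np.random.default_rng(2)
z = rng.normal(size=(200, 1))
X = z + 0.5 * rng.normal(size=(200, 4))    # four strongly correlated variables
chi2, p_value = bartlett_sphericity(X)
print(p_value < 0.05)    # True: the test rejects the identity matrix
```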

(32)

Principal Component Analysis (PCA)

A special case of factor analysis. This method leads to radical dimension reduction: instead of the originally used p variables, we express the statistical population with k transformed variables, where k << p.

The conclusions of the k-dimensional statistical analyses will also apply to the p-dimensional population. This can save considerable cost. It also becomes possible to illustrate a p > 3 dimensional population on a scatter plot (if k < 4).

Furthermore, the variables in the new space will be uncorrelated.

(33)

An example

When processing multispectral digital satellite images, a problem arises in displaying the visual content. If the number of spectral bands is more than three, then three must be selected for the R, G, B channels to make a composite image on the display. Choosing any three of the existing bands will result in significant visual loss. However, if we select the first three principal components for display, we get a much better solution.

(34)

The principal component transformation is:

$$F = A^T (X - \mu)$$

where $F = (F_1, \dots, F_p)^T$ is the vector of the principal components and $A = (a_1, \dots, a_p)$ is the matrix of the principal direction vectors (a special factor model).

(35)

The principal components are uncorrelated: $\mathrm{Cov}(F_i, F_j) = 0$ for $i \neq j$.

The importance of the principal components decreases: $\mathrm{Var}(F_1) = \lambda_1 \geq \mathrm{Var}(F_2) = \lambda_2 \geq \dots \geq \mathrm{Var}(F_p) = \lambda_p$.

The ratio $\lambda_i / \sum_{j=1}^{p} \lambda_j$ shows what percentage of the total variation is explained by $F_i$.
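These properties can be verified numerically via the eigendecomposition of the sample covariance matrix. A minimal sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(3)
# Correlated 3-D data (an illustrative stand-in for the slides' example).
z = rng.normal(size=(1000, 1))
X = np.hstack([z + 0.2 * rng.normal(size=(1000, 2)),
               rng.normal(size=(1000, 1))])

S = np.cov(X, rowvar=False)            # sample covariance matrix
eigval, A = np.linalg.eigh(S)          # eigh returns ascending order
order = np.argsort(eigval)[::-1]       # sort to decreasing importance
eigval, A = eigval[order], A[:, order]

F = (X - X.mean(axis=0)) @ A           # principal component scores

# The components are uncorrelated and their variances equal the
# eigenvalues, in decreasing order.
C = np.cov(F, rowvar=False)
print(np.allclose(C, np.diag(eigval)))   # True
print(np.all(np.diff(eigval) <= 0))      # True: lambda_1 >= lambda_2 >= ...
```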

(36)

The Scree Plot

The number of principal components to keep should be chosen at the point where the scree plot begins to be "flat".

(37)

Meaning of the principal directions

$a_1$ is the eigenvector of the covariance matrix belonging to the largest eigenvalue; it carries the most information.

$a_2$ is the eigenvector of the covariance matrix belonging to the second largest eigenvalue; it carries the most information among the directions orthogonal to $a_1$, and so on.

(38)

Dimension reduction

If, instead of the original p variables, only the first k principal components are kept, the lost information is merely

$$\frac{\sum_{j=k+1}^{p} \lambda_j}{\sum_{j=1}^{p} \lambda_j}.$$
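The lost-information ratio can be illustrated with a hypothetical eigenvalue spectrum (the numbers below are invented for the sketch):

```python
import numpy as np

# Hypothetical eigenvalue spectrum of a covariance matrix (decreasing).
eigval = np.array([6.0, 2.5, 1.0, 0.3, 0.15, 0.05])

k = 2
lost = eigval[k:].sum() / eigval.sum()       # information lost by keeping k PCs
explained = eigval[:k].sum() / eigval.sum()  # variance explained by first k
print(round(explained, 2))                   # 0.85
print(round(explained + lost, 10))           # 1.0
```

Here keeping the first two components retains 85% of the total variation, so only 15% of the information is lost.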

(39)

Dimension reduction

The first principal direction (z1) is the direction of the line in the plane of X around which the scattering of the points is largest.

The second principal direction (z2) is perpendicular to z1.

(40)
(41)


(42)
(43)

Illustration in 3D

[Figure: 3-D scatter with the 1st, 2nd and 3rd principal directions drawn.]

The axes do not form right angles: the variables are correlated!

(44)

Illustration in 3D

[3-D scatter plot of the scores on the three tests (1. teszt, 2. teszt, 3. teszt) with the 1st, 2nd and 3rd principal directions marked.]

We seek the longest axis of the scatter of points → this is the first principal direction. In this direction we can best differentiate between the points. The length (importance) of a principal component is characterized by its eigenvalue, which is the explained variance.

(45)

Illustration in 3D

[The same 3-D scatter plot of the three test scores with the 1st, 2nd and 3rd principal directions marked.]

Now we look for the longest axis perpendicular to the first principal direction → this is the second principal direction.

The procedure could be continued by finding the third principal component, but in this particular case it makes no sense, because the scattering in this direction is already insignificant, so 2 dimensions are enough to describe the data!

(46)

EXAMPLE 1

The World95.sav file contains data about 109 countries. We execute the factor analysis with the following 19 variables.

populatn   Population in thousands
density    Number of people / sq. kilometer
urban      People living in cities (%)
lifeexpf   Average female life expectancy
lifeexpm   Average male life expectancy
literacy   People who read (%)
pop_incr   Population increase (% per year)
babymort   Infant mortality (deaths per 1000 live births)
gdp_cap    Gross domestic product / capita
calories   Daily calorie intake
aids       Aids cases
birth_rt   Birth rate per 1000 people
death_rt   Death rate per 1000 people
aids_rt    Number of aids cases / 100000 people
b_to_d     Birth to death ratio
fertilty   Fertility: average number of kids
cropgrow   Crop growth rate
lit_male   Males who read (%)
lit_fema   Females who read (%)

(47)

Let's examine the relationships between the variables of the World95 file! The last column contains the MSA indicators of the variables.

EXAMPLE 1

(48)

EXAMPLE 1

(49)

07/28/2022 Lecture by Dr Ketskeméty László

(50)


(51)
(52)
(53)
(54)

The KMO is "meritorious"!

The Bartlett sphericity test rejects independence.

(55)

Communalities

Variable                                         Initial   Extraction
Population in thousands                          1,000     ,450
Number of people / sq. kilometer                 1,000     ,975
People living in cities (%)                      1,000     ,751
Average female life expectancy                   1,000     ,935
Average male life expectancy                     1,000     ,880
People who read (%)                              1,000     ,851
Population increase (% per year)                 1,000     ,775
Infant mortality (deaths per 1000 live births)   1,000     ,916
Gross domestic product / capita                  1,000     ,705
Daily calorie intake                             1,000     ,653
Aids cases                                       1,000     ,810
Fertility: average number of kids                1,000     ,882
cropgrow                                         1,000     ,711

Extraction Method: Principal Component Analysis.

One hundred times the communality value shows how many percent of each variable can be "explained" by the common factors. The variables with a small communality value stand out from the common information space. If we leave them out, a better factor analysis can be obtained for the remaining variables.

(56)

Total Variance Explained

                Initial Eigenvalues          Extraction Sums of           Rotation Sums of
                                             Squared Loadings             Squared Loadings
Component   Total   % of Var.  Cum. %    Total   % of Var.  Cum. %    Total   % of Var.  Cum. %
1           6,654   51,186     51,186    6,654   51,186     51,186    6,515   50,115     50,115
2           1,448   11,138     62,324    1,448   11,138     62,324    1,511   11,625     61,740
3           1,169   8,991      71,316    1,169   8,991      71,316    1,184   9,109      70,849
4           1,022   7,860      79,176    1,022   7,860      79,176    1,083   8,327      79,176
5           ,867    6,669      85,845
6           ,546    4,196      90,041
7           ,471    3,625      93,666
8           ,306    2,357      96,023
9           ,272    2,096      98,118
10          ,125    ,959       99,077
11          ,071    ,547       99,624
12          ,040    ,310       99,933
13          ,009    ,067       100,000

Extraction Method: Principal Component Analysis.

With four factors, the explained variance is almost 80%, i.e. 13 dimensions were reduced to 4 and "only" 20% of the information was lost!

(57)

The decrease in the importance of the principal components can be seen in the scree plot. In our case, the first four principal components were retained.

(58)

Component Matrix(a)

                                                          Component
Variable                                         1       2       3       4
Population in thousands                          -,062   ,563    ,331    ,140
Number of people / sq. kilometer                 ,185    -,112   -,179   ,946
People living in cities (%)                      ,779    -,371   ,024    ,079
Average female life expectancy                   ,956    -,094   -,061   -,094
Average male life expectancy                     ,926    -,125   -,037   -,065
People who read (%)                              ,898    ,075    -,089   -,177
Population increase (% per year)                 -,726   -,469   ,137    -,096
Infant mortality (deaths per 1000 live births)   -,950   ,057    ,070    ,080
Gross domestic product / capita                  ,757    -,117   ,304    ,164
Daily calorie intake                             ,765    -,070   ,239    -,075
Aids cases                                       ,097    ,147    ,878    ,085
Fertility: average number of kids                -,903   -,237   ,099    -,015
cropgrow                                         ,140    ,795    -,244   ,001

Extraction Method: Principal Component Analysis.
a. 4 components extracted.

This table shows the loading matrix. It can be seen with what weights the factors are involved in the production of each variable.

(59)

Rotated Component Matrix(a)

                                                          Component
Variable                                         1       2       3       4
Population in thousands                          -,136   ,458    ,470    ,023
Number of people / sq. kilometer                 ,090    ,012    -,037   ,982
People living in cities (%)                      ,804    -,259   -,031   ,189
Average female life expectancy                   ,964    ,039    -,071   ,019
Average male life expectancy                     ,935    ,002    -,051   ,044
People who read (%)                              ,894    ,197    -,072   -,081
Population increase (% per year)                 -,648   -,580   -,014   -,134
Infant mortality (deaths per 1000 live births)   -,951   -,077   ,068    -,029
Gross domestic product / capita                  ,749    -,084   ,310    ,202
Daily calorie intake                             ,776    -,035   ,220    -,027
Aids cases                                       ,087    -,060   ,892    -,051
Fertility: average number of kids                -,860   -,365   ,013    -,091
cropgrow                                         ,039    ,840    -,043   -,034

Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
Rotation converged in 5 iterations.
a. The loading matrix generated after the varimax rotation.

After rotation, we get a more interpretable model. This helps us to understand the factors and to explore the relationship system of the variables.

(60)

If we suppress the small values in the table, it becomes clearer. It is apparent that only factor 1 is involved in the production of the first two variables. So factor 1 is related to the level of culture.

(61)

Factor 2 is related to yield, agricultural development.

(62)

Factor 3 can be related to the development of healthcare, because it has a high value for variables such as the number of aids cases and the death rate.

(63)

Factor 4 is related to population densities.

(64)

Component Transformation Matrix

Component   1       2       3       4
1           ,987    ,125    ,027    ,097
2           -,118   ,959    ,235    -,106
3           ,020    -,248   ,958    -,145
4           -,107   ,055    ,165    ,979

Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.

This is the orthogonal matrix of the rotation.

In the space spanned by the first three principal direction vectors we can illustrate the locations of the examined variables.

(65)


In the data matrix, the principal direction vectors were saved as new variables.

(66)

With the first three principal direction vectors, each country can also be depicted in 3-D figures. The points were colored according to the economic region. This also may help in the interpretation of the factors.
