Improving and evaluating survey instruments

Improving and evaluating survey instruments: Survey-embedded experiments using online panels
Lisa Kareliusson
[SOM report no. 2011:31]

Improving and evaluating survey instruments: Survey-embedded experiments using online panels
LISA KARELIUSSON

Background
Since 1986, the SOM Institute at the University of Gothenburg has conducted annual mail surveys in Sweden and in Swedish regions. Representative samples of people aged 16-85 are asked about their media habits, their leisure habits and their attitudes on a range of social issues. Several research projects are involved in the surveys, and leading scholars in the Swedish social sciences contribute to the books that the SOM Institute publishes annually. The annual surveys also provide descriptions of long-term trends based on previous studies. The Institute's analyses are often referred to in public debate and can be considered to have a considerable impact on many levels of society. Collecting data suitable for longitudinal comparisons requires continuous development work. All surveys therefore need to be carefully scrutinized, and the quality of the data continuously evaluated. In addition, it is the ambition of the SOM Institute to contribute to the general development of research methods and survey research. The SOM Institute is constantly trying to improve its work: current methods are always subject to improvement, and new ones are continuously tested. Since a great deal of the collected data must be comparable over time, many questions need to be formulated the same way from year to year. Yet the surrounding society and people at large are constantly changing, which means that the SOM Institute must continually revise its survey questions. In this work, methodological experiments for validating and constructing survey instruments are becoming increasingly important. This report is part of that ongoing process. The ultimate goal is to have all new survey instruments thoroughly pretested before they appear in a regular SOM survey.
To this end, the SOM Institute participated with a series of methodological split-ballot experiments in a panel study called The Citizen Panel (M-panelen) in spring 2011. Through a methodological experiment it is possible to examine, for example, how the choice of single words and turns of phrase, or a different set of items in a survey question, may affect the answers and evaluations (see e.g. Schuman & Presser 1996). With this knowledge it becomes possible to develop survey questions, and to examine how their formulation affects the answers, in order to improve research methods and the surveys themselves. This paper examines the outcomes and conclusions of a series of methodological experiments that were embedded in a web survey sent to an online panel of respondents participating in the University of Gothenburg's Citizen Panel.

Design
The results are based on survey-embedded experiments included in a larger survey (M-panelen/The Citizen Panel) in which the respondents also received additional questions on various issues. This report examines four sets of questions from the methodological test. The experiments reflect a general interest in how different formulations of survey questions affect respondents' answers. The survey instruments examined in this report deal with four general areas: a) attitudes to cities in Sweden, b) attitudes to refugees, c) questions about siblings, and finally d) a question that can still be considered controversial in a Swedish context, namely the respondents' sexual orientation. The first three tests (a-c) were standard split-ballot experiments, while the fourth test (d) can be characterized as an evaluation of how the survey question on sexual orientation was perceived by the respondents. The survey questions were designed and formulated by the SOM Institute, while the practical work of transferring the questions to the web survey, sending out the survey and collecting the data was conducted by the Laboratory of Opinion Research (LORe) at the University of Gothenburg. Data for the study were gathered via web questionnaires sent out to participants in the Citizen Panel in April 2011. Most of the questions were followed by fixed response alternatives, with accompanying boxes for respondents to fill in (see appendix B).

Population and sample
The methodological test conducted by the SOM Institute included a total of 2,000 randomly selected people from the Citizen Panel's participants, which in total consists of about 8,000 respondents (Dahlberg et al. 2010; Dahlberg et al. 2011). These 2,000 respondents were then randomly assigned to ten groups of 200 respondents each. The groups then received different questions, or variations of the questions or items, included in the methodological tests.
By comparing the results of the different groups it becomes possible to investigate how small changes in wording can affect the understanding and evaluation of an issue. This paper examines the results of the methodological tests and focuses on how, and whether, the answers changed depending on the phrasing of the survey questions, items or response alternatives. It also examines how well the respondents managed to answer the questions, whether there were any problems in understanding the questions, and how to answer them. The focus of the report is therefore on how the questions worked and how changes in question format influence the answers.
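The assignment described above (2,000 sampled panelists randomly split into ten groups of 200) can be sketched as follows. This is an illustrative sketch, not the study's actual procedure or data; the panelist IDs and group labels are made up:

```python
import random

def assign_groups(panelist_ids, n_groups=10, seed=None):
    """Randomly assign panelists to equally sized experimental groups."""
    ids = list(panelist_ids)
    rng = random.Random(seed)
    rng.shuffle(ids)  # random order, so slicing yields a random partition
    size = len(ids) // n_groups
    return {g: ids[g * size:(g + 1) * size] for g in range(n_groups)}

# 2,000 sampled panelists split into ten groups of 200 each
groups = assign_groups(range(2000), n_groups=10, seed=42)
assert all(len(members) == 200 for members in groups.values())
```

Fixing the seed makes the assignment reproducible, which is useful when the same partition has to be re-derived later in the analysis.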

Results

Test 1. Popularity measurement of Swedish cities
The first methodological test was designed to test how questions about attitudes to various Swedish cities work. We wish to develop a measure of the popularity of Swedish cities in order to invite more research on creative cities and on how cities may work to become more appealing to the creative class, in line with the ideas of Richard Florida and others. This first test mainly aimed to analyze how, if at all, the respondents' evaluations were affected by how the scale was graded. The question the respondents received was simply "What is your opinion on the following cities?", followed by sixteen of the largest cities in Sweden, which the respondents were asked to evaluate (in two groups of eight, presented on two different web survey pages). The sample was here divided into two randomized split-ballot groups, group A and group B, with 200 respondents in each. Respondents in both groups were asked to evaluate the cities along a 5-point scale running from 1 to 5. The experimental variation was a small difference in the phrasing of the end points of the scale: at the upper end ("5"), the word "very" was included for group B. The two scales were A) "Does not seem to be a good city to live in" to "Seems to be a good city to live in", and B) "Does not seem to be a good city to live in" to "Seems to be a very good city to live in".

Findings
On the question of how respondents value different cities, the methodological test shows that the answers differ slightly depending on the scale. A comparison of how the two groups (group A and group B) valued the various cities reveals some differences. As the table below illustrates, group A in general tend to value the cities higher than group B do.

The table below shows, for each city, the percentage of respondents who valued it as fairly good or good/very good (i.e. answered 4 or 5 on the scale), by group:

City          Group A (%)   Group B (%)
Borås         26            24
Göteborg      62            56
Halmstad      41            38
Helsingborg   42            40
Jönköping     33            22
Kalmar        37            28
Linköping     45            39
Lund          63            52
Malmö         35            39
Stockholm     48            49
Sundsvall     33            22
Umeå          44            30
Uppsala       57            48
Västerås      24            18
Örebro        32            20
Östersund     31            26

The scales differed between the groups: A) 1 = Does not seem to be a good city to live in, 5 = Seems to be a good city to live in; B) 1 = Does not seem to be a good city to live in, 5 = Seems to be a very good city to live in. Group B did not only evaluate the cities more cautiously at the upper end of the scale; the experiment also showed differences at the lowest point. To some extent, group B evaluated cities as fairly bad or bad to a greater extent than group A did (see chart 1 in appendix A). This becomes clear when the means of the two groups' evaluations are analyzed: as the mean valuations below illustrate, the mean is lower in group B than in group A for almost every city. This gives an indication that when the word "very" is included at the highest point of the scale, respondents tend to become more cautious in their valuations.

Mean valuations of the cities, by group:

City          Group A   Group B
Borås         2.8       2.9
Göteborg      3.7       3.5
Halmstad      3.4       3.3
Helsingborg   3.3       3.3
Jönköping     2.9       2.8
Kalmar        3.2       3.0
Linköping     3.4       3.2
Lund          3.7       3.5
Malmö         2.7       2.9
Stockholm     3.4       3.4
Sundsvall     3.0       2.8
Umeå          3.3       3.0
Uppsala       3.6       3.5
Västerås      2.9       2.9
Örebro        3.0       2.8
Östersund     3.0       2.9

We now return to the primary question of whether the different scaling affected the evaluations of the cities. All in all, the results indicate that there are indeed differences between the answers of the two groups: group A not only tended to choose the highest scale point to a greater extent, they were also somewhat less likely to give low scores. One should not exaggerate these results, however, since the statistical significance is relatively weak. The experiment on the valuation of Swedish cities was not conclusive; to be specific, the results were significant for only two cities, Västerås and Örebro (see table 1 in appendix A). A cautious conclusion can thus be drawn: the results indicate that the phrasing of the end points of the rating scale affects, to some extent, respondents' evaluations of Swedish cities.

Criteria for evaluations
The respondents also received a follow-up question on what they take into account when evaluating the cities. This was of interest in order to gather more information on how to formulate future questions and items on the subject. A few central themes were mentioned several times and can therefore be of interest when analyzing how and what a question regarding valuations of cities should include. The most common topics that respondents said shaped their evaluation of the cities were: the labor market; surroundings, i.e. nature and water; communications; a wide cultural selection; proximity to universities; good health and social care; a low crime rate; and a good environment for business and enterprise.
These answers help us formulate questions in the future, especially if we want to construct a closed-ended follow-up question on what factors respondents regard as important when choosing what city to live in. The topics mentioned above could, for instance, be used to form items in future surveys.

Additional cities
Since one aim of the test was to examine how the question about the cities works and how it could best be formed, it is also interesting to analyze which cities seemed to be missing from the list, since more cities could possibly be included. A few cities occurred several times under the alternative "other city": Trollhättan, Gävle, Karlstad, Eskilstuna, Luleå and Uddevalla. However, these responses may have been shaped by the fact that a large share of the panel participants actually live in these cities. We have long known that some regions are over-represented in the MOD/LORe Citizen Panel, and some of the cities mentioned above are situated in those regions. It could also be interesting to analyze how the respondents valued the cities they suggested themselves in the field "other city" (see appendix B). A trend in the answers is that both groups tended to mention cities they valued highly, and very few mentioned cities they valued low.

Comments
At the end of the survey the respondents were asked to leave comments on the questionnaire, and some commented on the block concerning Swedish cities. A recurring comment was that the question was difficult to answer since it was unclear what it actually referred to. A number of respondents claimed that they needed more information on what aspects they were supposed to base their evaluations on. The question simply seems to have been too broad. In addition, a few respondents said that they had never been to a given city, or simply knew too little about it, which made it difficult to comment on it.
An analysis of the respondents' comments shows that they experienced some difficulty in answering the question, since they did not understand how they were supposed to rate the cities or what to take into account when valuing a city. This indicates that it is important to specify the question in some way. One way to solve this problem could be to use the results from this methodological test and form items from the various topics mentioned above.
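The group comparisons reported for this test (mean ratings in group A versus group B, city by city) amount to a two-sample test per city. A minimal sketch of such a comparison using Welch's t statistic, on made-up 1-5 ratings rather than the study's raw responses:

```python
import math
from statistics import mean, variance  # variance() is the sample variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    se2 = va / na + vb / nb  # squared standard error of the mean difference
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation of the degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Illustrative 1-5 ratings for one city from two split-ballot groups
group_a = [3, 4, 5, 4, 3, 4, 5, 3, 4, 4]
group_b = [3, 3, 4, 3, 2, 4, 3, 3, 4, 3]
t, df = welch_t(group_a, group_b)
```

In practice one would obtain a p-value from the t distribution (e.g. via scipy.stats); with ordinal 1-5 ratings, a chi-square test over the full response distribution, as reported in table 1 of the appendix, is a common alternative.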

Test 2. Anxieties
The second test aimed to examine how a mere change of wording, and additional clarification in one item, can affect respondents' attitudes regarding refugees. The main question was "Given the situation today, what do you experience as most worrying for the future?" For this purpose the sample was divided into three groups (a 33/33/33 split-ballot experiment), with 200 respondents in each group. Each group received a different variation of one item (item 4): "increased amount of refugees", "increased amount of refugees in Sweden", or "increased amount of refugees in the world". Apart from this change, all other items were exactly the same for all groups: terrorism, pollution, economic crisis, widespread unemployment, weakened democracy, organized crime, and increase in social inequality. The respondents were then asked to evaluate these items on a scale from "very anxious", "quite anxious", "not so anxious" to "not anxious at all". The aim of the test was to inform the scholars linked to the SOM Institute whether respondents evaluate their anxiety differently depending on this shift of perspective. It was also of interest to examine which of the various formulations of the item resulted in the highest and lowest proportions of worried respondents.

Findings
As the figures below illustrate, there was a great difference in the answers depending on the formulation of the item. The results show that the group that received the most specific and closest-to-home item (increased amount of refugees in Sweden) was least likely to state that they were worried or fairly worried.
Percentage fairly worried or worried, by item wording ("Given the situation today, what do you experience as worrying for the future?"):

Increased amount of refugees                 49.6 %
Increased amount of refugees in Sweden       39.0 %
Increased amount of refugees in the world    77.1 %

Most striking is the large difference between the first two groups and the group that received the item stressing an increased amount of refugees in the world, where almost 80 percent were worried

in some degree. The point estimate is almost twice as high as the proportion of panelists who, to some extent, were worried about an increased amount of refugees in Sweden: 77 percent of the respondents are worried by an increased amount of refugees in the world, whereas only 39 percent are worried by an increased amount of refugees in Sweden. The results of the test show that the formulation of the items affects how worried the respondents say they are regarding the question of refugees. There is a significant difference in responses between the group receiving the item "increased number of refugees in the world" (n=170) and the group receiving the item "increased number of refugees" (n=125): 77 percent compared to 49 percent indicated that they were very or quite worried (p<0.001). Likewise, there is a significant difference between the group receiving the item "increased number of refugees in Sweden" (n=159), in which only 39 percent indicated that they were very or quite worried, and the group receiving the item "increased number of refugees in the world" (see chart 2 in the appendix). To investigate how reliable these results are, it is also fruitful to analyze the differences between the groups on the other items in the question. On the item about terrorism, all three groups rated their worry very similarly (49.6 / 50.6 / 47.6 percent), and the same holds for pollution, where no distinctive differences can be found between the three groups (12 / 13.7 / 16.4 percent). This indicates that the large differences in the answers about refugees were due to the shift of perspective in the items rather than to other differences between the groups. There was only one comment after this question that is of interest here. Somewhat amusingly, a respondent pointed out that the question did not specify where the influx of refugees would increase, which made it difficult to answer.
This respondent had received the item that was most broadly formulated ("increased amount of refugees"), which is also the one used in the surveys at present. Although the primary purpose of the methodological test was to examine how the responses were affected by the change of wording, it also generated new knowledge on other aspects of the issue that can be of help in coming surveys. As the test illustrates, it really does matter, at least in this case, how the questions are formulated. Regarding the issue of refugees the results are also substantively interesting. What does it mean that people are worried about more refugees in the world? What do they read into that? War? Natural disasters? This is not something we can learn from this test, but it could be interesting to examine in upcoming waves of the Citizen Panel.
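The pairwise significance comparisons above correspond to a standard two-proportion z-test. A minimal sketch using counts reconstructed from the reported percentages and group sizes (49.6 % of n=125, 39.0 % of n=159, 77.1 % of n=170), so the counts are approximate, not taken from the raw data:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Counts derived from the reported percentages (approximate):
# "refugees" 62/125 (49.6 %), "in Sweden" 62/159 (39.0 %), "in the world" 131/170 (77.1 %)
z_world_vs_plain = two_proportion_z(131, 170, 62, 125)
z_world_vs_sweden = two_proportion_z(131, 170, 62, 159)
```

Values of |z| above 2.58 correspond to p < 0.01 two-sided, which is consistent with the report's finding that both differences are significant at the 99 % level.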

Test 3. Siblings
The next experiment concerned a survey instrument about the respondents' siblings. This block was divided into a number of sub-questions and was sent to 200 respondents, who all received the same questions. This experiment was more a test of a new type of question, in order to analyze how it works. The analysis of the answers therefore includes details on the response rate and on how the respondents managed to answer the question, i.e. whether they understood the question and the instructions.

Findings
The first question the respondents received was "Did you grow up with one or more siblings?" (the response alternatives were yes or no). According to the results, 56.5 percent of the respondents grew up with one or more siblings. This first question seemed to be the easiest to answer: of 200 respondents, 154 answered, giving a response rate of 77 percent. This question was not itself in focus, but it serves as a gateway to the subsequent questions and as an analytical tool. The respondents who answered yes were then asked to enter year of birth and gender for each of the siblings they grew up with; it was thus the siblings they had lived together with that were in focus. The question was followed by five rows where the respondents could fill in year of birth and gender for each sibling (see appendix B). The distribution of the number of siblings appears reasonable: 1 sibling 43 percent, 2 siblings 32 percent, 3 siblings 16 percent, 4 siblings 4 percent, 5 siblings 5 percent. The test also revealed some difficulties and details that need to be improved. The main purpose of the methodological test was, as mentioned, to test whether the respondents were able to answer the question.
All of the 134 respondents who stated on the first question that they have one or more siblings also filled in the year of birth on the first row, and all but one also filled in the gender on the second question.

One problem that appeared was that a number of respondents did not seem to understand that they were supposed to list the siblings in chronological order, or where to put themselves. This indicates that a more detailed introductory text may be necessary. In the survey there were fixed options for the year of birth, on a scale from "Before 1910" to "1995". The experiment made it clear that there should at least be an option for years later than 1995, since a number of respondents could not answer the question correctly because they had siblings born after 1995. Another conclusion drawn from the results is that there should be more than five rows, since a small number of respondents stated that they had more than five siblings and therefore were not able to answer the question fully.
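The data-entry problems noted in this test (birth years outside the fixed range, more siblings than rows, unclear ordering) are the kind of thing a simple validation pass can flag. A sketch with hypothetical field names and limits mirroring the fixed options described above, not the survey's actual implementation:

```python
def check_sibling_rows(rows, min_year=1910, max_year=1995, max_rows=5):
    """Flag sibling entries that the fixed-format question cannot capture.

    `rows` is a list of (birth_year, gender) tuples as entered.
    """
    issues = []
    if len(rows) > max_rows:
        issues.append(f"{len(rows)} siblings reported but only {max_rows} rows available")
    years = [y for y, _ in rows]
    for y in years:
        if y > max_year:
            issues.append(f"birth year {y} is later than the last fixed option ({max_year})")
        elif y < min_year:
            issues.append(f"birth year {y} is earlier than the first fixed option ({min_year})")
    if years != sorted(years):
        issues.append("siblings not listed in chronological order")
    return issues

# A respondent with a sibling born in 1997 cannot answer correctly:
problems = check_sibling_rows([(1968, "F"), (1997, "M")])
```

Running such checks on pilot data makes it easy to count how many respondents a given fixed format excludes before the question goes into a regular survey.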

Test 4. Sexual orientation
The question about sexual orientation is interesting to test in various ways, since it can be considered a sensitive and highly personal issue. Therefore, a cautious test was carried out to investigate to what extent the respondents were willing to answer the question, and perhaps above all to see the respondents' reactions. A group of 200 respondents received the question "What is your sexual orientation?" with the following response alternatives: heterosexual, homosexual, bisexual, other (please specify), or the alternative "I do not want to answer". In order not to be too conspicuous, the question was asked together with other background questions and was carefully placed so that it would not appear alone on its own page in the survey (see appendix B). The title of the block was thus "a few questions about yourself", and the block began by asking the respondents about their education, to prevent the question about sexual orientation from becoming too prominent.

Findings
The methodological test of this question suggests that there are no major difficulties in asking about sexual orientation, at least not in an elite panel such as the MOD/LORe Citizen Panel. The response rate for the question is somewhat lower than for the other questions, 70 percent compared to around 77 percent. Likewise, the results give no indication that the respondents did not want to answer; as the table below illustrates, only 5 percent of the respondents said so.

Heterosexual             87.9 %
Homosexual               2.9 %
Bisexual                 4.3 %
I do not want to answer  5.0 %

It may also be interesting to see how the responses are distributed among different age groups, as the table below illustrates. However, since the groups are of very different sizes, the results are reported in absolute numbers instead of percentages, as percentage comparisons between the groups would be misleading.
As the table illustrates, none of the age groups' answers stand out much from the others, and no group was more likely to refrain from responding.

Age    Heterosexual  Homosexual  Bisexual  Do not want to answer  n
16-29  10            0           3         2                      15
30-49  42            2           2         3                      49
50-69  36            2           0         0                      38
70-85  5             0           0         0                      5

Furthermore, the reactions to the question were not as numerous as the research team had first expected, and they were also more mixed than expected. Since it was mainly the reactions that were in focus of this methodological test, the comments in the comment fields were of greater interest than the actual answers, and these will therefore

be in focus here. There were too few comments to draw any conclusions about how the respondents in general reacted to the question. The fact that there were so few comments could, however, indicate that most respondents did not react either positively or negatively to it. Among the comments, some indicated that the respondent did not know what sexual orientation they had, or that sexual orientation is not something static. One conclusion that can be drawn from this is that it could be fruitful to add an alternative such as "I am unsure of my sexual orientation", which has been used elsewhere (see Livsvillkor och hälsa bland unga homo- och bisexuella: resultat från nationella folkhälsoenkäten, Folkhälsoinstitutet). For the same reason it may also be good to have an "other" option for those who feel they do not fit any of the other alternatives. One comment was formulated as a question, stressing that the question was irrelevant and merely aimed to categorize people. A similar comment stressed that sexual orientation could not be of any relevance to the other answers. This indicates that it is important to have a clear aim for the question of sexual orientation to refer to.

Final words
The methodological tests yielded highly variable results, some more easily analyzed and conclusive than others. However, it is clear that the methodological tests contribute new knowledge, which in turn can contribute to new and better questions, or to the confirmation of old ones. This new knowledge can in turn lead to improvements in research and, hopefully, to new and exciting questions for illuminating different social issues.

References
Dahlberg, S. et al. (2010). The 2010 Internet Campaign Panel. MOD Working Paper Series 2010:3. http://www.mod.gu.se/digitalassets/1334/1334411_1326233_mod-working-paper-2011_1---the-2010-internet-panel---dahlberg-et-al.pdf
Dahlberg, S. et al. (2011). Documentation version 20110826. Multidisciplinary Opinion and Democracy Research Group, University of Gothenburg. http://www.mod.gu.se/digitalassets/1341/1341820_lore-dokumentation--hela-dokumentet-.pdf
Schuman, H. & Presser, S. (1996). Questions & Answers in Attitude Surveys. Thousand Oaks, CA: Sage.
Statens folkhälsoinstitut. Livsvillkor och hälsa bland unga homo- och bisexuella. http://www.fhi.se/documents/aktuellt/nyheter/pm-unga-hbt-livsvillkor-halsa-0907.pdf

Appendix A
This appendix presents the charts and tables referred to in the text.

Chart 1
The chart illustrates the percentage of each group that valued the cities as bad or fairly bad.

Chart 2
Item                                             Very + quite anxious (%)   Not so / not at all anxious (%)   n
(102) Increased amount of refugees               49.6                       50.4                              125
(103) Increased amount of refugees in Sweden     39.0                       61.0                              159
(104) Increased amount of refugees in the world  77.1                       22.9                              170

The difference between (102) and (103) is not significant (sig. 0.074).
The difference between (102) and (104) is significant at the 99 % level (sig. 0.000).
The difference between (103) and (104) is significant at the 99 % level (sig. 0.000).

Table 1: Valuations of the cities (row percentages for scale points 1-5, by group; XA and XB are group means)

City         Group A: 1  2  3  4  5  Sum  n    Group B: 1  2  3  4  5  Sum  n    Sig    XA    XB    Sig
Borås                 12 26 37 19  7  100  137           9 24 43 17  7  100  123  0.86   2.83  2.9   0.292
Göteborg               2 11 24 36 26  100  140           7  8 30 36 20  100  123  0.23   3.74  3.54  0.493
Halmstad               1 16 41 29 12  100  140           2 11 49 27 11  100  122  0.605  3.35  3.33  0.34
Helsingborg            5 14 39 26 16  100  139           6 15 39 27 13  100  122  0.98   3.33  3.27  0.726
Jönköping             12 25 30 24  9  100  139           9 28 41 20  2  100  123  0.074  2.94  2.79  0.035
Kalmar                 6 15 41 24 13  100  140           6 25 41 20  8  100  122  0.234  3.22  2.99  0.214
Linköping              4 13 39 34 11  100  140           5 17 40 32  7  100  121  0.704  3.35  3.19  0.681
Lund                   2  8 27 44 19  100  141           2 10 35 37 15  100  124  0.479  3.7   3.54  0.649
Malmö                 21 28 16 29  6  100  140          21 16 24 30  9  100  122  0.16   2.71  2.89  6.82
Stockholm              9 15 27 27 21  100  142           6 16 29 31 18  100  124  0.84   3.35  3.37  0.308
Sundsvall              9 18 41 29  4  100  140           6 31 41 20  2  100  122  0.087  3.01  2.81  0.961
Umeå                   6 14 36 35  9  100  139           7 22 40 24  6  100  124  0.173  3.29  3.01  0.275
Uppsala                2  9 32 40 17  100  141           1 11 41 37 11  100  123  0.299  3.62  3.46  0.354
Västerås              11 19 46 18  6  100  140           2 29 51 16  2  100  122  0.02   2.9   2.87  0.034
Örebro                 8 22 38 25  7  100  138           7 31 43 20  0  100  122  0.016  3.01  2.75  0.323
Östersund              9 18 42 23  8  100  137        10.9 21 42 21.8 4.2 100  119  0.686  3.04  2.87  0.853

Appendix B
The questions in the web survey
This part of the appendix shows all the questions asked in the methodological test, exactly as they were designed in the web survey.


The SOM Institute at the University of Gothenburg conducts yearly national and local surveys in Sweden, and gives seminars on Society, Opinion and Media. The SOM Institute, Seminariegatan 1B, P.O. Box 710, SE-405 30 Gothenburg, Sweden, +46 31 786 3300, info@som.gu.se, www.som.gu.se