Show simple item record

dc.contributor.advisor Adalı, Tuğba
dc.contributor.author Bulut, Tevfik
dc.date.accessioned 2019-05-10T07:28:05Z
dc.date.issued 2019-05-10
dc.date.submitted 2019-04-24
dc.identifier.uri http://hdl.handle.net/11655/7092
dc.description.abstract Although the literature contains studies comparing mail and web survey modes across different populations in terms of data quality indicators, very few studies have compared these two modes on firms. There is no study in the national literature comparing mail and web survey modes, and no study in the international literature has compared these two data collection methods in the context of organized industrial zones (OIZs). The purpose of this study is to compare mail and web survey modes in terms of data quality indicators and response rates among firms at the production stage in OIZs. The research consists of two stages; in each stage a different questionnaire, both containing partly repeated questions, was administered in web and mail modes. 847 firms responded to the first-stage questionnaire and 343 to the second-stage questionnaire. The analyses show that the web survey mode achieved a higher response rate than the mail survey mode. The primacy effect is higher in the web survey mode for some, though not all, questions. In general, the consistency of answers to repeated questions is high and does not differ by survey mode, and straightlining likewise does not vary by survey mode. Item nonresponse in mail questionnaires does not differ between survey stages. Evaluated holistically, the findings show that the survey modes did not differ greatly in terms of data quality on firms in OIZs in the context of Turkey. Because the firms were selected by random sampling, the findings are generalizable.
The results are expected to make a significant contribution to both the national and international literature on comparing data quality in OIZs. This study is also of particular importance because it has the characteristics of an establishment survey. The findings and fieldwork observations indicate that the web survey mode will be useful in future research on OIZs in Turkey.
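Two of the data quality indicators the abstract compares across modes, internal consistency (Cronbach's coefficient alpha) and straightlining in grid questions, can be illustrated with a minimal sketch. This is not the study's own analysis code; the response matrix and function names below are hypothetical, invented purely for illustration.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def straightlining_share(items: np.ndarray) -> float:
    """Share of respondents giving the identical answer to every item in a grid."""
    same = (items == items[:, [0]]).all(axis=1)
    return same.mean()

# Hypothetical 5-point Likert answers: 6 respondents x 4 grid items.
grid = np.array([
    [4, 4, 4, 4],   # straightliner
    [5, 4, 5, 4],
    [2, 2, 3, 2],
    [3, 3, 3, 3],   # straightliner
    [5, 5, 4, 5],
    [1, 2, 1, 2],
])
print(round(cronbach_alpha(grid), 3))        # → 0.966
print(round(straightlining_share(grid), 3))  # → 0.333
```

Indicator values of this kind, computed per mode, are what the web-versus-mail comparisons in Chapter 4 of the thesis operate on.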
dc.description.tableofcontents TABLE OF CONTENTS
ACKNOWLEDGEMENTS i
ÖZET ii
ABSTRACT iii
TABLE OF CONTENTS iv
LIST OF TABLES vi
LIST OF FIGURES vii
LIST OF ABBREVIATIONS viii
CHAPTER 1. INTRODUCTION 1
1.1. Research Questions 3
CHAPTER 2. OVERVIEW OF RELATED CONCEPTS AND LITERATURE REVIEW 5
2.1. Survey Process 5
2.2. Concise History of Survey Modes 13
2.3. Theoretical Framework and Selected Indicators 16
2.3.1. Survey Errors 16
2.3.2. Response Rate 18
2.3.3. Primacy Effect 19
2.3.4. Item Nonresponse 20
2.3.5. Internal Consistency 20
2.3.6. Straightlining 20
2.4. Literature Review on Comparison of Mail and Web Surveys in Terms of Data Quality Indicators and Response Rate 21
2.5. Hypotheses 23
CHAPTER 3. METHODOLOGY 26
3.1. Survey Design 26
3.1.1. Sampling Frame 26
3.1.2. Sample Selection 31
3.1.3. Questionnaire Design 35
3.1.4. Pre-test 39
3.1.5. Pilot Test 40
3.1.6. Fieldwork 42
3.1.7. Data Processing and Analysis 45
3.2. Calculation of Data Quality Indicators and Response Rate 45
3.2.1. Response Rate 45
3.2.2. Primacy Effect 48
3.2.3. Item Nonresponse 48
3.2.4. Internal Consistency 48
3.2.5. Straightlining 49
3.2.6. Methods of Statistical Analysis 50
3.3. Data Quality Variables for Analysis 51
CHAPTER 4. RESULTS 54
4.1. Comparative Results of Data Quality Indicators and Response Rate in Web and Mail Surveys 54
4.1.1. Respondent Characteristics 54
4.1.2. Response Rate 60
4.1.3. Primacy Effect 67
4.1.4. Item Nonresponse 73
4.1.5. Internal Consistency 76
4.1.6. Straightlining 82
CHAPTER 5. DISCUSSION AND CONCLUSION 85
REFERENCES 94
APPENDIX A. UNIVERSITY ETHICS COMMITTEE APPROVAL 102
APPENDIX B. RESEARCH PERMISSION ISSUED BY MIT 103
APPENDIX C. ADVANCE LETTER EXAMPLE, STAGE 1 104
APPENDIX D. COVER LETTER FOR WEB QUESTIONNAIRE, STAGE 1 105
APPENDIX E. COVER LETTER FOR MAIL QUESTIONNAIRE, STAGE 1 106
APPENDIX F. COVER LETTER FOR WEB QUESTIONNAIRE, STAGE 2 107
APPENDIX G. COVER LETTER FOR MAIL QUESTIONNAIRE, STAGE 2 108
APPENDIX H. MAIL QUESTIONNAIRE, STAGE 1 109
APPENDIX I. MAIL QUESTIONNAIRE, STAGE 2 122
APPENDIX J. RESPONDENTS BY OCCUPATIONAL POSITION AND RESPONSE RATES, STAGE 1 AND STAGE 2 134
APPENDIX K. ITEM-BASED DESCRIPTIVE STATISTICS OF PRIMACY EFFECT BY SURVEY MODE 138
APPENDIX L. SCALE STATISTICS BY SURVEY MODE SWITCHES 140
APPENDIX M. STRAIGHTLINING MEASURES BY SURVEY MODE AND SURVEY STAGE 142
APPENDIX N. ANALYSIS VARIABLES OF THE 1ST STAGE QUESTIONNAIRE 143
APPENDIX O. ANALYSIS VARIABLES OF THE 2ND STAGE QUESTIONNAIRE 152
LIST OF TABLES
Table 3.1.1.1. Postal Address Status of Firms by OIZ Type 28
Table 3.1.1.2. Number of Firms Remaining After the Removal of Duplicate Records 30
Table 3.1.1.3. Number of Firms with E-mail Addresses 30
Table 3.1.2.1. Response Rates (RRs) Expected by Survey Stage and Survey Mode 32
Table 3.1.2.2. Number of Respondents Expected by Survey Stage 34
Table 3.1.5.1. Pilot Test Target Sample Size by Survey Stage and Survey Mode 41
Table 3.2.1.1. Final Disposition Codes for Mail and Web Surveys 46
Table 4.1.1.1. Respondent Characteristics by Survey Mode, Stage 1 55
Table 4.1.1.2. Respondent Characteristics by Survey Mode, Stage 2 57
Table 4.1.1.3. Respondents by Frequencies of Occupational Position, Stage 1 and Stage 2 59
Table 4.1.1.4. Respondents by Survey Mode and Stratum, Stage 1 and Stage 2 60
Table 4.1.2.1. Disposition Codes by Stratum, Stage 1 61
Table 4.1.2.2. Response Rates (RRs) by Stratum, Stage 1 61
Table 4.1.2.3. Disposition Codes by Survey Mode, Stage 1 62
Table 4.1.2.4. Comparison of Response Rates (RRs) by Survey Mode, Stage 1 63
Table 4.1.2.5. Disposition Codes by Stratum, Stage 2 64
Table 4.1.2.6. Response Rates (RRs) by Stratum, Stage 2 64
Table 4.1.2.7. Disposition Codes by Survey Mode, Stage 2 65
Table 4.1.2.8. Comparison of Response Rates (RRs) by Survey Mode, Stage 2 66
Table 4.1.2.9. Disposition Codes by Survey Mode Switches, Stage 2 67
Table 4.1.2.10. Response Rates (RRs) by Survey Mode Switches, Stage 2 67
Table 4.1.3.1. Item-based Comparison of Primacy Effect by Survey Mode, Stage 1 70
Table 4.1.3.2. Item-based Comparison of Primacy Effect by Survey Mode, Stage 2 72
Table 4.1.4.1. Level of Item Nonresponse by Stratum 73
Table 4.1.4.2. Item-based Comparison of Item Nonresponse by Survey Stage 75
Table 4.1.5.1. Coefficient Alpha Values by Survey Mode Switch 77
Table 4.1.5.2. Item-based Comparison of Internal Consistency by Survey Mode Switch 80
Table 4.1.5.3. Survey Mode Comparisons for the “Current market volume of the predominant sector in OIZ” Question 82
Table 4.1.6.1. Item-based Comparison of Straightlining by Survey Mode and Survey Stage 84
Table 5.1. Summary of Findings by Hypothesis 89
dc.language.iso en
dc.publisher Nüfus Etütleri Enstitüsü
dc.rights info:eu-repo/semantics/openAccess
dc.subject Data quality
dc.subject Survey mode
dc.subject Total survey error
dc.subject Response rate
dc.title Comparison of Mail and Web Survey Modes on Firms in Organized Industrial Zones (OIZs)
dc.type info:eu-repo/semantics/masterThesis
dc.description.ozet Although the literature contains studies comparing web and mail data collection modes across different populations in terms of data quality indicators, very few studies have compared these two modes on firms. There is no study in the national literature comparing web and mail data collection approaches, and no study in the international literature has compared these two data collection methods in the specific context of organized industrial zones (OIZs). The purpose of this study is to compare web and mail data collection modes in terms of data quality indicators and response rates among firms at the production stage in OIZs in Turkey. The research consists of two stages; in each stage a different questionnaire, both containing partly repeated questions, was administered in web and mail modes. 847 firms answered the first-stage questionnaire and 343 the second-stage questionnaire. The analyses show that, when the two modes are compared in terms of response rates, the web mode achieved the higher rate. Selection of the first response option is higher in the web mode for some, though not all, questions. In general, the consistency of answers to repeated questions is high and does not differ by data collection mode, and straightlining likewise generally does not differ by survey mode. Item nonresponse in mail questionnaires does not differ between research stages. Evaluated holistically, the findings show that data quality for firms in OIZs in the context of Turkey does not differ greatly by data collection mode.
In conclusion, because the firms were selected by random sampling, the findings about the target population obtained from this study are generalizable. The results of comparing data quality in the specific context of OIZs are expected to make a significant contribution to both the national and international literature. This study is also of particular importance because it has the characteristics of an establishment survey. The findings and fieldwork observations indicate that using the web data collection mode will be useful in future research on OIZs in Turkey.
dc.contributor.department Sağlık Yönetimi
dc.embargo.terms Open access
dc.embargo.lift 2019-05-10T07:28:05Z

