5.4 Decolonizing the Canadian Workplace
Tricia Nicola Hylton
“Colonialization as a system [did] not go away, it remains”
(Decolonisation, 2023)
One of the many misconceptions about colonization is that it ended when colonizing powers departed occupied lands. However, in Decolonisation (2023), Dr. Ndlovu-Gatsheni explained that the departure of colonial powers from occupied lands only signified the “end of colonization as an event” (2:45), not of colonialism as a system. Colonialism was not simply the occupation of land and the stealing of resources. Colonialism also involved instituting the colonial worldview: the process of replacing the values, beliefs, economic, political, and educational systems of Indigenous populations with those of the colonizing nation (Decolonisation, 2023; Wilson & Hodgson, 2018). Dr. Ebalaroza-Tunnell (2024) states, “Colonialism wasn’t just about physical land grabs; it also imposed dominant ideologies and systems of knowledge…that prioritized Eurocentric viewpoints, hierarchical structures, and an “extractive” approach to work” (para. 7).
For persons and nations once colonized, the colonial worldview has resulted in many detrimental effects. The Office of the High Commissioner for Human Rights (n.d.) and the United Nations High Commission (2023) argue that modern-day discrimination, inequality, and injustice are rooted in colonialism. Specifically, the Office of the High Commissioner for Human Rights (n.d.) states that a direct connection exists between colonialism and
- contemporary forms of racism, racial discrimination, and xenophobia, and
- intolerance faced by Africans, people of African descent, people of Asian descent, and Indigenous Peoples.
The United Nations High Commission (2023) adds the following groups to those negatively impacted by the colonial worldview:
- persons of diverse sexual orientations,
- persons of diverse gender identities and expressions and/or sex characteristics, and
- women and children.
This understanding of colonization is important when discussing decolonization. Introduced in Unit 5.1: Equity, Diversity, and Inclusion: Terminology, decolonization is the intricate process of breaking down the ideologies, systems, and structures established by the colonial worldview, and normalized over time, in order to establish new ones in their place (Decolonisation, 2023). This Unit does not propose to examine the full breadth of issues associated with decolonization. Instead, this Unit takes a specific look at decolonization as a strategy to deconstruct the ideologies, systems, and structures that exist in the Canadian work environment.
The Colonial Worldview and Its Impact
We’ll begin our discussion by examining the characteristics of the colonial worldview and how these characteristics materialize in the Canadian work environment. Common characteristics of the colonial worldview include:
- Valuing individualism and competition and seeking success for oneself above all else,
- Believing the environment and its resources can be owned and exploited for the accumulation of wealth and personal enjoyment,
- Instituting a hierarchical order in society that declares some (e.g. men) as more important, powerful, and influential than others (e.g. women),
- Placing importance on the accumulation of material things as a symbol of status and power, and
- Acknowledging only one correct view of the world.
(Intercultural Communications, n.d.; Wilson & Hodgson, 2018; Indigenous Corporate Training, Inc. 2016; Ermine, 2007).
Case Study 1 provides two examples of how these characteristics are experienced in the Canadian workplace by members of racialized and marginalized groups.
Case Study 1: The Canadian Public Service
According to the Canadian Encyclopedia (2013), the Canadian Public Service supports the development and administration of government services such as healthcare, national defence, and justice. Those within the Public Service act as agents of the presiding government, elected officials, and governmental departments. In this role, the Public Service reflects and represents Canadian values. The following two examples detail the experience of two racialized and marginalized groups employed in Canada’s Public Service.
Example 1:
The Black Executives Network (BEN), a support group of Black executives working in the federal Public Service, released a report in November 2024 based on interviews with 100 current and former Black Public Service executives. The report detailed the experience of systemic racism endured by former and current members of the BEN in the Public Service. Some of the more disturbing findings in the report include:
- being called the N-word,
- being threatened with physical violence,
- being denied career advancement opportunities,
- enduring instances of workplace harassment and intimidation,
- encountering threats of reputational harm, and
- having the merit of professional credentials, experience, and position questioned and/or undermined.
(CBCNews, 2024)
Example 2:
The report, Many Voices One Mind: A Pathway to Reconciliation (2017), details the experiences, barriers, and challenges faced by Indigenous Peoples working in the Public Service. Based on 2,100 responses from current and former Indigenous Public Service employees, the following are some of the more important challenges and barriers encountered:
- undervaluing of experience and expertise,
- being denied career advancement and promotion opportunities,
- experiencing unfair and inaccessible hiring practices,
- enduring general harassment and discrimination practices, and
- combatting lack of respect for Indigenous cultures.
Of note, the report also found that Indigenous employees reported significantly higher levels of harassment and discrimination than non-Indigenous employees and that only 3.7% of senior positions in the Canadian Public Service were held by an Indigenous Person (Many Voices One Mind: A Pathway to Reconciliation, 2017).
The experiences of Black executives and Indigenous employees in Canada’s Public Service are disturbing, as they should be. The significance of these negative interactions occurring in Canada’s largest single employer (CBCNews, 2024) cannot be overstated. However, as untenable as these practices are, the question that must be asked and answered is whether these practices embody the characteristics of the colonial worldview.
One way to answer this question is to assess the noted negative behaviours towards Black executives and Indigenous Public Service employees against the characteristics of the colonial worldview described above (Intercultural Communications, n.d.; Wilson & Hodgson, 2018; Indigenous Corporate Training, Inc., 2016). An analysis of Figure 5.4.1 reveals that the treatment of Black executives and Indigenous employees falls under three categories of the colonial worldview: adherence to a singular worldview, acceptance of the notion that some are superior to others, and validation of individualism and competition.

A second measure for answering whether the practices towards Black executives and Indigenous Public Service employees embody the characteristics of the colonial worldview was introduced in Unit 5.2: EDI in the Canadian Workplace. There, Williams (2020) offers the findings of a 10-year study that identified five patterns of bias experienced in everyday business interactions: the tightrope bias, the maternal wall bias, the racial stereotype bias, the tug-of-war bias, and the prove-it-again bias. An examination of Figure 5.4.2 reveals that the treatment of Black executives and Indigenous Public Service employees reflects four of the identified biases: the prove-it-again bias, the racial stereotype bias, the tug-of-war bias, and the tightrope bias.

Although Case Study 1 highlights the actions of those within Canada’s Public Service, there is a case to be made for the broad scale existence of the colonial worldview in business structures and systems across this country. Ironically, the case is made by the existence of equity, diversity, and inclusion (EDI) policies. As we will discuss, there would be no need for EDI policies and initiatives if current business systems and structures were not rooted in the colonial worldview.
Exercise: Reader Reflection
Use the template below to plot the practices experienced or observed in a past or current workplace against the characteristics of the colonial worldview and the five patterns of everyday business biases.
EDI and the Colonial Worldview
Figures 5.4.1 and 5.4.2 reveal that the colonial worldview is alive and well in Canada’s largest employer. The harmful behaviours discussed in Case Study 1 demonstrate what scholars refer to as the invisible, broadly accepted, and unchallenged existence of colonial values and beliefs (Decolonisation, 2023; Ermine, 2007; Sue, 2021; Sue et al., 2020) that continue to inform, influence, and affect everyday interactions. Ermine (2007) explains the phenomenon this way:
This notion of universality remains simmering, unchecked, enfolded as it is, in the subconscious of the masses and recreated from the archives of knowledge and systems, rules and values of colonialism that in turn wills into being the intellectual, political, economic, cultural, and social systems and institutions of this country (p. 198).
In the business context, policies introduced to remedy the adverse impacts of the colonial worldview are classified under the banner of EDI. Golden (2024) notes that the genesis of EDI policies is rooted in “a time when societal movements and legal changes began to reshape the corporate world” (para. 2) to balance recognized inequities in business structures and systems. Taking shape during the 1960s (Golden, 2024), EDI policies initially focused on redressing racial and gender inequality. However, by the 1990s, EDI policies “began to recognize and address the diverse needs of various identity groups, including ethnic, religious, and LGBTQ+ communities” (Golden, 2024, para. 11).
Today, 60+ years after the first EDI policies were introduced (Golden, 2024), the inequities that existed in business structures and systems are, as Case Study 1 demonstrates, not yet balanced. In fact, several recent reports on the state of EDI in business reveal that, both domestically and internationally, businesses continue to support existing EDI policies and plan to increase their financial commitments to them in coming years as a strategy to address ongoing inequities in the workplace (Benefits Canada, 2023; Carufel, 2024; Workday, 2024). The following statistics and Figure 5.4.3 illustrate this finding:
- In 2023, 72% of Canadian business leaders increased financial support of EDI policies and initiatives (Benefits Canada, 2023). Figure 5.4.3 further illustrates the perspective of business leaders on the importance of EDI initiatives (Carufel, 2024).
- In 2024, a survey of 2,600 global business leaders found that 78% believed EDI policies had risen in importance and planned on increasing their organization’s EDI budget (Workday, 2024).
- In 2024, a survey of over 400 executives in mid-to-large U.S. companies reported that
- 79% of respondents confirmed the existence of EDI policies at their firm,
- 78% of respondents planned on increasing the support for EDI policies over the next two years, and
- 73% of respondents believed the importance of EDI policies had grown over the last five years.
(Carufel, 2024).
These statistics, among others, are an indication that the invisible and broadly accepted ideologies of the colonial worldview (Decolonisation, 2023; Ermine, 2007; Sue, 2021; Sue et al., 2020) are present in our modern workspaces. A problem that does not exist does not need a solution. Workspaces that are diverse, equitable, and inclusive do not need EDI policies to balance the playing field. The continued and growing need for EDI policies confirms the broad scale influence of the colonial worldview in business systems and structures and the continued need for strategies to counteract and temper that influence.
What can be done to decolonize “workplaces into spaces where knowledge is democratized, power is shared, and equity thrives” (Ebalaroza-Tunnell, 2024, para. 2)? The remainder of this Unit will examine this question from a structural and individual perspective.
Decolonizing Business Systems and Structures: An Indigenous Perspective
Can equity, diversity, and inclusion truly happen within business systems and structures founded on hierarchy, dominance, exploitation, and authority? Can a system meant to benefit some and not others create an inclusive playing field where everyone can thrive? Williams (2020) answers these questions with a resounding no: “To address structural [inequity], you need to change structures…we need to fix the business system” (1:10).
Changing society’s structures and systems will require a massive, intentional, and sustained ideological shift away from the established values and beliefs of the colonial worldview towards the values and beliefs of an alternative worldview. This ideological shift, at its essence, is the process of decolonization (Decolonisation, 2023). Ermine (2007) writes that in order “to redesign social systems[,] we need…[to] accept that a monoculture with a claim to one model of humanity and one model of society” (p. 198) is a fallacy. Such an ideological shift would reshape social systems to reflect the beliefs and values of the alternative worldview, and specifically for the purpose of this discussion, reshape business systems and structures in a similar manner.
According to Kouri-Towe and Martel-Perry (2024), decolonization can be examined through two alternative worldviews: the Indigenous and Afro-centric worldviews. Decolonization can also be examined through a number of individual approaches, including gender equality, sexual equality, racial equality, and environmental equality. This Unit will examine the process and impacts of decolonization through the lens of the Indigenous worldview. The University of Alberta (n.d.) explains the Indigenous worldview this way:
Indigenous Ways of Knowing are based on the idea that individuals are trained to understand their environment according to teachings found in stories. These teachings are developed specifically to describe the collective lived experiences and date back thousands of years. The collective experience is made-up of thousands of individual experiences. And these experiences come directly from the land and help shape the codes of conduct for indigenous societies. A key principle is to live in balance and maintain peaceful internal and external relations. This is linked to the understanding that we are all connected to each other. The hierarchical structure of Western world views that places humans on the top of the pyramid does not exist. The interdependency with all things promotes a sense of responsibility and accountability. The people would respond to the ecological rhythms and patterns of the land in order to live in harmony. (00:30).
Continue learning about the Indigenous worldview and how it differs from the colonial worldview by viewing the full video, World View.
Rose (2021) suggests that in many ways, the Indigenous worldview offers an opposing philosophy on how to interact with and value the world we live in. The author offers that integral to the Indigenous worldview is the understanding that “all living things are connected…contribute to the circle of life equally and should be acknowledged and respected as such” (3:27). Review Figure 5.4.4 for the differences in characteristics between the Indigenous worldview and the colonial worldview.

Figure 5.4.4 illustrates quite a few differences in the belief and value systems of the colonial and Indigenous worldviews. Of note are
- a shift away from a singular worldview to a worldview that is inclusive of diverse experiences and voices,
- a shift away from superiority of some over others to equality and balance for all,
- a shift away from competition to collaboration and cooperation,
- a shift away from individualism to collectivism and community, and
- a shift away from exploiting resources to honouring and safeguarding them.
These shifts are not superficial. They suggest a fundamental rewiring of the way we think about and interact with each other and our environment. Central to our conversation is the possibility of applying the Indigenous worldview as a vehicle to decolonize the workplace. Let’s examine this possibility further.
Equality: The Indigenous worldview does not include a hierarchical structure that places one race over another, men over women, or humans over nature. Thus, the Indigenous worldview embedded with the philosophy of equality could create work environments that
- respect the skills and contributions of all employees equally,
- are stewards of the land, animals, and other natural resources, and
- welcome the skills and contributions of all employees at all levels of operation.
Variation: The Indigenous worldview acknowledges and welcomes the diversity that exists within humanity. The Indigenous worldview embedded with the philosophy of variation could result in work environments that are
- innovative, collaborative, and democratic, and
- empathetic, supportive, and inclusive.
Community/interconnectedness: Being in good relations with your community is at the heart of the Indigenous worldview (Rose, 2021). A work culture embedded with this value could result in work environments that
- do not exploit workers,
- ensure a basic living wage,
- provide safe work environments,
- support a work-life balance, and
- employ sustainable business practices.
EXERCISE: Reader Reflection
The list presented above is not exhaustive as the Indigenous worldview would result in many other changes to work environments. Share your thoughts on what other changes you believe would result from the establishment of the Indigenous worldview in the workplace.
Another way to evaluate the Indigenous worldview’s potential to decolonize the systems and structures that make up our work environments is the degree to which it avoids or prevents the five patterns of bias experienced in everyday business interactions (Williams, 2020). Table 5.4.1 presents each bias, its description, and how the Indigenous worldview is likely to affect its occurrence in the workplace.
Table 5.4.1 How Indigenous Worldview Can Change Five Types of Bias

EDI and the Indigenous Worldview
An important observation when reviewing the potential results of implementing the Indigenous worldview in business structures and systems is the organic occurrence of behaviours and practices that are typically considered the outcome of EDI policies and initiatives. Aspects of equity, diversity, and inclusion are not planned, legislated, or imposed under the Indigenous worldview. Instead, practices that are currently the result of EDI policies occur naturally because they are directly connected to and enmeshed with the Indigenous worldview.
This possibility of an alternative way of doing and thinking in the many workspaces across this country is not imaginary or hypothetical. Work environments that reflect the principles of the Indigenous worldview already exist. The following case study of Sanala Planning, an Indigenous-owned and -run business, is one example of the Indigenous worldview in action.
Case Study 2: Sanala Planning
Originally Alderhill Planning, Sanala Planning is based in Kamloops, B.C., and is owned and run by Jessie Hemphill, a member of the Sqilxw People (Sanala, n.d.). Founded in 2016, the company’s mandate is to “work with governments, businesses and First Nations communities to support organizational development, providing everything from trauma-informed facilitation to comprehensive community plans, to workshops on reconciliation and decolonization” (Kilawna, 2022, para. 4). Through community consultations, a bottom-up approach was used to ensure the company’s business plan, mission, and activities aligned with the needs and wants of the communities the organization serves.
A first business practice to be highlighted here is one not typically associated with the business environment: empathy. Hemphill says this trait contributes to “a really safe [and] nurturing place to work” (Kilawna, 2022, para. 30). Allowing staff members to breastfeed during Zoom meetings, designating office space for spiritual practice, and giving employees the flexibility to openly acknowledge and share mental health challenges (Kilawna, 2022) are some of the practices that support this trait. Other alternative business practices employed by Sanala Planning include
- a non-hierarchical and inclusive decision-making process with employees,
- a shortened work week (4 days/week), and
- reduced daily work hours.
(Kilawna, 2022).
Many aspects of the Indigenous worldview can be observed in the creation and day-to-day business practices of Sanala Planning: community engagement, collaboration, work-life balance, empathy, and a democratic power structure. Of significance is the understanding that these practices have not undermined Sanala Planning’s ability to be financially successful. According to Kilawna (2022), Sanala Planning has a market value of $2 million.
Although businesses like Sanala Planning exist, businesses established to reflect the Indigenous worldview are not commonplace. However, opportunity exists to integrate Indigenous values and beliefs into business systems and structures. Ermine (2007), Fielding News (2022), and Egale Canada (2024) offer the following strategies to accomplish this goal:
- Do your research to continue learning the histories, teachings, practices, beliefs and values of Indigenous Peoples.
- Challenge your own understanding and acceptance of current business systems and structures. Do the hard work necessary to bring to light many of the unchallenged values and beliefs that form the basis of current business systems and structures.
- Advocate for the Indigenization of workplace practices by using your knowledge of the Indigenous worldview to promote an ideological shift in workplace practices.
- Promote the establishment of physical spaces that create communal connection and spiritual awareness.
Exercise: Reader Reflection
Compare the business practices of Sanala Planning to an organization/business that you currently work for or previously worked for. Reflect on the differences and similarities between the two organizations. How do you think these differences and similarities affect or affected the work environment?
Decolonizing Organizational Structure: Individual Action
Systemic and structural change in the workplace will undoubtedly take time. In the meantime, there is much you can do on an individual level to decolonize your current and future work environments for all equity seeking groups.
1. Allyship
The concept of allyship was introduced in Unit 5.1: Equity, Diversity, and Inclusion: Terminology. There, an ally is defined as a person who supports equity seeking persons or groups through action by interrupting behaviours and situations counter to fairness and equality (para. 34). Working with this definition, Ravishankar (2023) offers the following strategies to help recognize and disrupt patterns of inequity in the workspace:
- Learn about the stereotypes, experiences, and identities of the marginalized group you would like to support. Here the author suggests contacting different organizations that advocate for the rights of marginalized groups. If you are interested in supporting one particular group or notice specific harmful behaviours in your workplace that you would like to address, speak directly to those affected to understand how best to lend your support.
- Unlearn your own personal biases. We all carry some biases, many of which may be unconscious. Biases are products of our worldview. Reflect on your own behaviours and thoughts to make the unconscious, conscious. When you better understand the bias you hold, you can address them and begin your journey towards allyship. Review Figure 5.4.5 for a model of how to work your way from being unaware of your own biases towards conscious awareness and action.
- Voice your opposition to instances of microaggressions, discrimination, racism, sexism, homophobia, ableism, etc. when you witness these behaviours in the work environment. The author notes that saying something “as simple as ‘Not cool,’ or ‘That’s not funny’” (para. 17) is suitable when you encounter such behaviours at work.
- Model the behaviour you would like to see in your workplace. By treating people appropriately, you can provide an example for others of how to interact respectfully with diverse members of your work environment. The author suggests that the use of inclusive language (see 3.2) is one way to model how to treat others respectfully.

The publication, Skoden (2022), includes the Ally Bill of Responsibilities developed by Dr. Lynn Gehl. Developed from an Indigenous perspective, the Ally Bill of Responsibilities provides an extensive list of what it means to be a responsible ally.
2. Be Curious, Take Accountability, and Unlearn to Relearn
In the TEDx Talk, Decolonizing the Workplace (2024), the speaker, Toni Lowe, offers three tactics that can further help decolonize the workforce.
Strategy 1: Be curious. Lowe (2024) offers that when certain dynamics occur in the workplace, you have a responsibility to investigate the reason for the observed behaviour, thought, or action. For example, Case Study 1 revealed that Indigenous Public Service employees were asked to complete IQ tests that non-Indigenous employees were not (Many Voices One Mind: A Pathway to Reconciliation, 2017). Thus, when similar situations occur in your workplace, Lowe (2024) encourages you to become curious and ask questions like:
- What is happening?
- Why is this happening?
- What belief forms the basis for this action?
- Do I believe this?
- Is this true?
- What else do I believe that may not be true?
The answers to questions like these, the author suggests, may lead to a new awareness of inequities in your work environment.
Strategy 2: Take accountability. Reflect on your role in the current system. Consider whether your “silence, complacency, or advantage” (Lowe, 2024) props up or confronts the status quo. The author explains that reflecting on your role in the system is the first step towards taking accountability for your actions or inactions, acknowledging the lived experience of racialized and marginalized groups, and embracing your power to influence your environment.
Strategy 3: Unlearn to re-learn. In agreement with Ravishankar (2023), Lowe (2024) also emphasizes the importance of recognizing and addressing unconscious biases. The author argues that in doing so, you will be able to clearly see your choice to stop participating in and supporting an outdated worldview that no longer makes sense for a modern world (Lowe, 2024).
3. Believe Lived Experiences
A simple but effective strategy to decolonize the workplace is based on a study of the effects of microaggressions in the work environment. Sue (2021) concludes that an important aspect of decolonizing the work environment is for members of dominant groups to acknowledge and believe the lived experiences of maltreatment endured by racialized and marginalized colleagues. Although the experience of racialized and marginalized groups is often very different from the experience of members of dominant groups in the same work environment, the author encourages members of dominant groups not to ignore, deny, doubt, discredit, or minimize these lived experiences (Sue, 2021). Instead, Sue et al. (2020) direct individuals to
- Reaffirm and validate the lived experience of racialized and marginalized groups by believing their experiences,
- Be vocal and visible in your support for the fair treatment of every employee,
- Be vocal and visible in your opposition to the maltreatment of any employee, and
- Seek support from and mobilize with like-minded employees to develop better workplace practices and policies.
Exercise: Reader Reflection
- Which of the individual decolonizing strategies above are you most comfortable integrating into your current and/or future workplace? Explain why.
- Can you think of other strategies that could also decolonize the workplace?
Colonialism and the colonial worldview are not issues of the past. Instead, the colonial worldview continues to affect all aspects of Canadian society, including our workspaces. The review of practices within the Canadian Public Service demonstrated the colonial worldview at work by revealing discriminatory, racist, hostile, and exclusionary treatment of Black and Indigenous employees. Such actions, however, are not confined to the Canadian Public Service. Instead, the continued existence and growing importance of EDI policies in our workspaces indicate the presence of continued inequities towards racialized and marginalized groups. A strategy to decolonize the workplace at the structural and systemic level was offered via the Indigenous worldview. Here, we saw that the Indigenous worldview provides an alternative to the recognized imbalances in current business structures and systems because of its capacity to create workspaces that organically reflect the principles of EDI. The example of Sanala Planning was offered as an illustration of the potential of the Indigenous worldview to create successful and financially profitable work environments. At the individual level, the Unit ends by speaking directly to you, the reader, whose role is central in creating decolonized workspaces. By adopting the individual actions discussed above, you can become an agent of change in “paving a way for a workforce where everyone can thrive” (Lowe, 2024, 14:11).
References
Benefits Canada. (2023). 72% of business leaders increased investment in DEI over past year: survey. News. https://www.benefitscanada.com/news/bencan/72-of-business-leaders-increased-investment-in-dei-over-past-year-survey/
Beebe, S., Beebe, S., Ivy, D., & Watson, S. (2005). Communication principles for a lifetime (Canadian Edition). Pearson.
Carufel, R. (2024). Despite a year of attacks and criticism, business leaders continue to support DEI initiatives: New research examines DEI progress and today’s challenges. Agility PR Solutions.
CGSMUS. (2023). Decolonisation [Video]. YouTube. https://www.youtube.com/watch?v=6Km_TmRuk7Q
Doerr, A. (2013). Public service. The Canadian Encyclopedia. https://www.thecanadianencyclopedia.ca/en/article/public-service
Ebalaroza-Tunnell, G. (2024). Decolonizing the workplace: Building a more equitable future. Medium. https://medium.com/@DrGerryEbalarozaTunnell/decolonizing-the-workplace-building-a-more-equitable-future-87022f903187
Egale Canada. (2024). Indigenous workbook. Building bridges. https://egale.ca/wp-content/uploads/2024/02/2.-Indigenization-Workbook_Final.pdf
Ermine, W. (2007). The ethical space of engagement. Indigenous Law Journal 6(1). https://jps.library.utoronto.ca/index.php/ilj/issue/view/1822
Fielding News. (2022). Why making vital distinctions between Indigenous and dominant worldviews is not a “binary thinking problem”. Fielding Graduate University. https://www.fielding.edu/making-vital-distinctions-between-indigenous-and-dominant-worldviews/
Golden, H. (2024). History of DEI: The evolution of diversity training programs – NDNU. Notre Dame de Namur University. https://www.ndnu.edu/history-of-dei-the-evolution-of-diversity-training-programs/
Government of Canada. (2017). Many voices one mind: A pathway to reconciliation. https://www.canada.ca/en/government/publicservice/wellness-inclusion-diversity-public-service/diversity-inclusion-public-service/knowledge-circle/many-voices.html#toc7
Indigenous Canada. (n.d.). World View. University of Alberta. https://www.coursera.org/lecture/indigenous-canada/indigenous-worldviews-xQwnm
Indigenous Corporate Training, Inc. (2016). Indigenous worldview vs. western worldview. https://www.ictinc.ca/blog/indigenous-worldviews-vs-western-worldviews
Kilawna, K. (2022). Meet the sqilxw women who are decolonizing the workplace. IndigiNews. https://indiginews.com/features/elaine-alec-shares-what-it-means-to-decolonize-the-workplace?gad_source=1&gclid=EAIaIQobChMI_8HK2e-BiQMVBDYIBR2UfDUEEAAYAyAAEgLUh_D_BwE
Kouri-Towe, N., & Martel-Perry, M. (2024). Better practices in the classroom. Concordia University. https://opentextbooks.concordia.ca/teachingresource/
Potter, R., & Hylton, T. (2019). Intercultural relations. Technical writing essentials. https://pressbooks.senecapolytechnic.ca/technicalwriting/
Public Service Alliance of Canada. (2024). Shocking internal report exposes rampant discrimination at the head of Canada’s public service. https://psacunion.ca/shocking-internal-report-exposes-rampant
Ravishankar, R. A. (2023). A guide to becoming a better ally. Harvard Business Review. https://hbr.org/2023/06/a-guide-to-becoming-a-better-ally
Rose, M. (2021). Indigenous worldview: What is it, and how is it different? [Video]. YouTube. https://www.youtube.com/watch?v=4KzqMMYatc4
Sanala. (n.d.). Sanala Planning. https://www.sanalaplanning.com/
Skoden. (2022). Teaching, talking, and sharing about and for reconciliation. Seneca College.
Sue, D. W. (2021). Microaggressions and the “lived experience” of marginality. Division 45. https://division45.org/microaggressions-and-the-lived-experience-of-marginality/
Sue, D. W., Calle, C. Z., Mendez, N., Alsaidi, S., Glaeser, E. (2020). Microintervention strategies: What you can do to disarm and dismantle individual and systemic racism and bias. https://www.google.ca/books/edition/Microintervention_Strategies/GaIQEAAAQBAJ?hl=en&gbpv=1&pg=PA1&printsec=frontcover.
TedxTalks. (2024). Decolonizing the workplace [Video]. TedxFrisco. https://www.youtube.com/watch?v=Lg7fMbN1MjI
Thurton, D. (2024). Internal report describes a ‘cesspool of racism’ in the federal public service. CBCNews. https://www.cbc.ca/news/politics/racism-federal-public-service-black-1.7378963
United Nations. (2023). Summary of the panel discussion on the negative impact of the legacies of colonialism on the enjoyment of human rights. General Assembly. https://www.ohchr.org/sites/default/files/documents/hrbodies/hrcouncil/sessions-regular/session54/A_HRC_54_4_accessible.pdf
United Nations. (n.d.). Racism, discrimination are legacies of colonialism. The Office of the High Commissioner for Human Rights. https://www.ohchr.org/en/get-involved/stories/racism-discrimination-are-legacies-colonialism
Wilson, K., & Hodgson, C. (2018). Colonization. Pulling Together: Foundations Guide. https://opentextbc.ca/indigenizationfoundations/chapter/colonization/
Williams, J. (2020). Why corporate diversity programs fail and why small tweaks can have big impacts [Video]. TedxMileHigh. https://www.ted.com/talks/joan_c_williams_why_corporate_diversity_programs_fail_and_how_small_tweaks_can_have_big_impact
Williams, J. C., & Mihaylo, S. (2019). How the best bosses interrupt biases on their teams. Harvard Business Review. https://hbr.org/2019/11/how-the-best-bosses-interrupt-bias-on-their-teams
Workday. (2024). Workday DEI landscape report: Business leaders remain committed in 2024. Human Resources. https://blog.workday.com/en-us/workday-dei-landscape-report-business-leaders-remain-committed-2024.html
In the era of AI hallucinations, “fake news,” deliberate misinformation, and “alternative facts,” businesses must rely on solid and verifiable information to move their efforts forward. Finding reliable information can be easy if you know where to look and how to evaluate it. You may already be familiar with traditional sources of information, like library databases, government publications, journals, and the like, but with the increasing use of AI in data gathering, you may also want to consider non-traditional sources. This chapter will guide you on finding credible sources and evaluating research information using traditional and non-traditional resources.
Finding Research Information
Research can be obtained from primary and secondary sources. Primary research consists of original work, like experiments, focus groups, interviews, and the like, that generates raw information or data that are then interpreted in reporting. Secondary research consists of finding information and data that have been gathered by others and typically reported and published in some usable form. More often than not, researchers use both types of research in order to create a balance between original data and those already interpreted by other researchers.
Primary Research
Primary research is any research that you do yourself in which you collect raw data directly from the “real world” rather than from articles, books, or internet sources that have already collected and analyzed the data. Primary research in business is most often gathered through interviews, surveys and observations:
- Interviews: one-on-one or small group question and answer sessions. Interviews will provide detailed information from a small number of people and are useful when you want to get an expert opinion on your topic. For such interviews, you may need to have the participants sign an informed consent form before you begin.
- Surveys/Questionnaires: a form of questioning that is less flexible than interviews, as the questions are set ahead of time and cannot be changed. These involve much larger groups of people than interviews but result in fewer detailed responses. Informed consent is made a condition of survey completion: the purpose of the survey/questionnaire and how data will be treated are explained in the introduction, and participants then choose to proceed or not.
- Naturalistic observation in non-public venues: involves taking organized notes about occurrences related to your research. Observations allow you to gain objective information without the potentially biased viewpoint of an interview or survey. In naturalistic observations, the goal is to be as unobtrusive as possible, so that your presence does not influence or disturb the normal activities you want to observe. If you want to observe activities in a specific workplace, classroom, or other non-public places, you must first seek permission from the manager of that place and let participants know the nature of the observation. Observations in public places do not normally require approval. However, you may not photograph or video your observations without first getting the participants’ informed voluntary consent and permission.
While these are the most common methods, others are also gaining traction. Some examples of primary research include engaging with people and their information via social media, creating focus groups, engaging in beta-testing or prototype trials, and conducting medical and psychological studies, some of which require a detailed review process.
Secondary Research
Secondary research information can be obtained from a variety of sources, some of which involve a slow publication process such as academic journals, while others involve a more rapid publication process, such as magazines (see Figure 7.3.1). Academic journals typically involve a slow publication process due to the peer review cycle. They contain articles written by scholars, often presenting their original research, reviewing the original research of others, or performing a “meta-analysis” (an analysis of multiple studies that analyze a given topic). The peer review process involves the evaluation and critique of pre-publication versions of articles, which give the authors the opportunity to justify and revise their work. Once this process is complete (it may take several review cycles and up to two years), the article is then published. This often rigorous peer review process is what helps to validate research reporting and why such articles are considered of greater reliability than unreviewed materials.
Figure 7.3.1 Examples of Popular vs Scholarly Sources (Last, 2019).
Valid information can also be found in popular publications, however. Such publications have a more rapid publication process without peer review. Though the contents of articles published here are often of high quality, they are always the subject of extra scrutiny and skepticism when used in research because of the lack of peer oversight. In addition, such publications may have editorial boards that serve specific political, religious, economic, or social agendas, which may create a bias in the type of content offered. So be selective as to which popular publication you turn to for information.
For more information on popular vs scholarly articles, watch this Seneca Libraries video: Popular and Scholarly Resources.
Traditional academic sources: Scholarly articles published in academic journals are usually required sources in academic research; they are also an integral part of business reports. But they are not the only sources for credible information. Since you are researching in a professional field and preparing for the workplace, you will draw upon many kinds of traditional credible sources. Table 7.3.1 lists several types of such sources.
Table 7.3.1 Typical traditional academic research sources for business
| Traditional Academic Secondary Sources | Description |
|---|---|
| Academic Journals, Conference Papers, Dissertations, etc. | Scholarly (peer-reviewed) academic sources publish primary research done by professional researchers and scholars in specialized fields, as well as reviews of that research by other specialists in the same field. For example, the Journal of Computer and System Sciences publishes original research papers in computer science and related subjects in system science; the International Journal of Business Communication is one of the most highly ranked journals in the field. |
| Reference Works (often considered tertiary sources) | Specialized encyclopedias, handbooks, and dictionaries can provide useful terminology and background information. For example, the Encyclopedia of Business and Finance is a widely recognized authoritative source. You may cite Wikipedia or dictionary.com in a business report, but be sure to compare the information to other reliable sources before use. |
| Books and Chapters in Books | Books written by specialists in a given field usually contain reliable information and a References section that can be very helpful in providing a wealth of additional sources to investigate. For example, The Essential Guide to Business Communication for Finance Professionals by Jason L. Snyder and Lisa A.G. Frank has an excellent chapter on presentation skills. |
| Trade Magazines and Popular Science Magazines | Reputable trade magazines contain articles relating to current issues and innovations; therefore, they can be useful in determining what is “cutting edge” or finding out what current issues or controversies are affecting business. Examples include The Harvard Business Review, The Economist, and Forbes. |
| Newspapers (Journalism) | Newspaper articles and media releases offer a sense of what journalists and people in industry think the general public should know about a given topic. Journalists report on current events and recent innovations; more in-depth “investigative journalism” explores a current issue in greater detail. Newspapers also contain editorial sections that provide personal opinions on these events and issues. Choose well-known, reputable newspapers such as The Globe and Mail. |
| Industry Websites (.com) | Commercial websites are generally intended to “sell” a product or service, so you have to select information carefully. These websites can also give you insights into a company’s “mission statement,” organization, strategic plan, current or planned projects, archived information, white papers, business reports, product details, cost estimates, annual reports, etc. |
| Organization Websites (.org) | A vast array of .org sites can be very helpful in supplying data and information. These are often public service sites and are designed to share information with the public. |
| Government Publications and Public Sector Websites (.gov/.edu/.ca) | Government departments often publish reports and other documents that can be very helpful in determining public policy, regulations, and guidelines that should be followed. Statistics Canada, for example, publishes a wide range of data. University websites also offer a wide array of non-academic information, such as strategic plans, facilities information, etc. |
| Patents | You may have to distinguish your innovative idea from previously patented ideas; you can look these up and get detailed information on patented or patent-pending ideas. |
| Public Presentations | At public consultation meetings, representatives from business and government speak to various audiences about current issues and proposed projects. These can be live presentations or video presentations available on YouTube or TED Talks. |
| Other | Can you think of some more? (Radio programs, podcasts, social media, etc.) |
You may want to check out Seneca Libraries Business Library Guide for information on various academic sources for business research information.
Business-related sources: The use of AI and other technologies such as sensors and satellites in information collection has also led to other types of information sources that may not be considered the norm, but which nonetheless offer equally valuable and credible research information. Some of these are used to collect large amounts of data (big data). Here in Table 7.3.2 is a selection of such sources (Note: Copilot and Elicit are the only approved AI tools at Seneca):
Table 7.3.2 Business-related secondary sources (CIRI, 2018; Microsoft, 2025)*
*NOTE: These non-traditional research sources are used in business and industry, but would not normally be used for academic research.
| Business-Related Secondary Sources | Description |
|---|---|
| Automated Searches | AI tools like Elicit and Consensus will use key words and research questions to quickly aggregate relevant academic publications available in the Semantic Scholar database. Such tools offer additional features, such as summarization, insights, and citations. ChatGPT, Copilot, Gemini, Claude, and other LLM models are also being used for searches. Though their accuracy and citations are improving and vary from model to model, output must be carefully reviewed prior to use. (A brief illustrative sketch of an automated search follows this table.) |
| Social Media (LinkedIn, X, Blue Sky, Mastodon, Facebook Groups) | Depending on who you follow, you can find brief useable content and links to articles and other information that can be used as research sources on social media. For example, experts on LinkedIn, Blue Sky, and X often share their published research, blog posts, videos, and other materials that are then discussed by other experts. The original posts, attachments/links, and comments are all rich sources of information, but they must be carefully evaluated prior to use. |
| Alternative Media | News sources such as Canadian sites like Rabble and magazines like The Walrus offer timely articles on current events, politics, and social issues. A variety of alternative news sources exist that satisfy a range of political and social perspectives, both from the left and the right, so you must use a critical lens to evaluate these sources for bias. |
| Alternative Financial and Other Worlds | Cryptocurrency and metaverse (simulated worlds and gaming) data are growing sources of information for consumer behaviour, investment, game development, and financial trends. |
| Blogs | Experts in various fields post theories, research findings, and developing thinking on the latest topics in their field on blogs like WordPress or Substack or on personal websites. Be selective, as blog postings usually consist of work in progress, so the thinking can change depending on what the author's research reveals over time. |
| Online Communities | Sites like Reddit and Discord are digital communities where posts are organized according to interest areas. Carefully evaluate all information for credibility and accuracy as the posts are often opinion and experience based. |
| Crowdsourcing Platforms | Research into new products, services, and tools that are in the pre-release phases can be done by searching not only patents but also crowdsourcing platforms like Kickstarter and StackExchange. |
| Sensor and Wearable Technologies | Sensor and wearable technology data, such as from heart monitors, fitness sensors, diabetes monitors, car health systems, and smart meters, all aid in the collection of personal, physical, mechanical, and environmental data and the like. |
| Mobile and Transactional Applications | Mobile apps will collect data on user behaviour, consumer preferences, and other information. Transactional technologies will capture data relating to finances, retail, consumer behaviour and preferences, and the like. |
| Website and Other Internet Technologies | The internet can collect data about consumer behaviours and operations from various devices as they link to the network. Specific websites collect data on usage, page views, and consumer behaviours. Some tools will scrape the internet to collect data on competitors, market price trends, and product availability. |
| Satellite and Geolocation Technologies | Satellite data and imagery, such as gathered by Orbital Insight, can assist with geospatial information gathering that can be used in areas like supply chain, real estate, as well as governance and geopolitics. Geolocation data is collected from devices that permit the gathering of information based on geographical location. SafeGraph and Placer.ai, for example, gather data and create insights relating to location, foot traffic, community context, and more to develop marketing and branding information. |
| Weather and Environmental Technologies | In the real estate, manufacturing, supply chain, aviation, and agricultural sectors, for example, instruments gather data that will provide information on weather patterns and trends, air and water quality, soil and air conditions, water levels and conditions, as well as other factors. |
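To make the idea of an automated search a little more concrete, here is a minimal, hypothetical sketch (in Python) of how a researcher or developer might query the public Semantic Scholar Graph API, the same scholarly database that tools like Elicit and Consensus draw on. The endpoint and field names reflect the publicly documented API; the query string, result handling, and output formatting are illustrative assumptions, not a prescribed workflow, and any results would still need the careful evaluation described above.

```python
# Illustrative sketch only: a keyword search against the public Semantic Scholar
# Graph API (the database behind tools such as Elicit and Consensus).
# Assumes the free, unauthenticated endpoint; an API key would raise rate limits.
import requests

def search_papers(query: str, limit: int = 5) -> list[dict]:
    """Return basic metadata for papers matching a keyword query."""
    response = requests.get(
        "https://api.semanticscholar.org/graph/v1/paper/search",
        params={"query": query, "limit": limit, "fields": "title,year,venue,url"},
        timeout=30,
    )
    response.raise_for_status()
    # The API returns matches under the "data" key.
    return response.json().get("data", [])

if __name__ == "__main__":
    # Hypothetical query; swap in your own research question keywords.
    for paper in search_papers("equity diversity inclusion workplace"):
        print(f"{paper.get('year')}  {paper.get('title')}  ({paper.get('venue')})")
```

A script like this only aggregates candidate sources; it does not judge their quality, so each result still needs to be evaluated using the criteria in Table 7.3.3.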
Be reminded that when gathering, storing, and using data, especially that related to people's personal information, you must abide by the Canadian federal and provincial privacy laws. Please refer to the Personal Information Protection and Electronic Documents Act (PIPEDA) and Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) for more information. See more about protecting people's information in Chapter 7.5 Research Ethics.
Deep Research Reasoners and Agents
LLMs have developed reasoning capabilities to the point that they can now complete complex research and, paired with agentic technologies, do so autonomously (McFarland, 2025; Mollick, 2025). OpenAI's Deep Research and Google's Deep Research, for example, are AI agents (autonomous, task-focused AI tools) that are able to produce copious and largely reliable analyses, summaries, insights, and other materials based on materials and direction provided by researchers.
The Google and OpenAI tools are not the same, however. As Alex McFarland (2025), an AI journalist, points out, the capabilities of each model differ substantially. For example, OpenAI's Deep Research goes about the research in a less structured manner than Google's model, resulting in broader and more comprehensive research results that are reliable and of high quality (Mollick, 2025). The output reveals that OpenAI's agent takes a more creative approach that follows information as it is revealed and pursues unexpected avenues of search to uncover deep information. In addition, OpenAI's model will show its reasoning process (McFarland, 2025; Mollick, 2025). On the other hand, Google's Deep Research uses a more structured approach that relies heavily on information provided by the user (McFarland, 2025). Its output is limited to primary information, reports, and links that are often culled from websites, including paywalled sites, that vary in quality and reliability (Mollick, 2025).
The kind of research you are doing will help determine what you want to pay for access. According to McFarland (2025), if you are a professional in finance, academia, or policy, for example, it would be worth it to use OpenAI's model; on the other hand, if you are a casual user, then the Google model would be better suited for you. Be reminded that Copilot is the only approved LLM for use at Seneca, and it really should not be used for research purposes.
Evaluating Research Materials
Mark Twain, supposedly quoting British Prime Minister Benjamin Disraeli, famously said, "There are three kinds of lies: lies, damned lies, and statistics." On the other hand, H.G. Wells has been (mis)quoted as stating, "statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write" (Quora, n.d.). The fact that the actual sources for both “quotations” are unverifiable makes their sentiments no less true. The effective use of statistics can play a critical role in influencing public opinion as well as persuading in the workplace. However, as the fame of the first quotation indicates, statistics can be used to mislead rather than accurately inform—whether intentionally or unintentionally.
The importance of critically evaluating your sources for authority, relevance, timeliness, and credibility can therefore not be overstated. Anyone can put anything on the internet, and people with strong web and document design skills can make this information look very professional and credible—even if it isn't. Moreover, LLMs are notorious for producing content that is inaccurate and often unverifiable. Since much research is currently done online, and many sources are available electronically, developing your critical evaluation skills is crucial to finding valid, credible evidence to support and develop your ideas. In fact, corroboration of information has become such a challenging issue that sites like this List of Predatory Journals regularly update their lists of journals that subvert the peer review process and simply publish for profit.
When evaluating research sources, regardless of their origin (LLM or traditional research), be careful to critically evaluate the authority, content, and purpose of the material, using the questions in Table 7.3.3. You should also ensure that the claims included in LLM output include source information and that you corroborate or check those sources for accuracy. For more information on evaluating sources, also view this brief Seneca Libraries video, Evaluating Websites.
And it may be tempting to use an LLM to go through the output it created to check for accuracy and bias. But why would you do that? If it created erroneous or unsupported content to begin with, you would not want it to check its own output. That's your responsibility. Here are some tools to help you through this process.
Table 7.3.3 A question-guide for evaluations of the authority, content, and purpose of information
| Criterion | Questions to Consider |
|---|---|
| Authority (Researchers, Authors, Creators) | Who are the researchers/authors/creators? Who is their intended audience? What are their credentials/qualifications? What else has this author written? Is this research funded? By whom? Who benefits? Who has intellectual ownership of this idea? How do I cite it? Where is this source published? What kind of publication is it? Authoritative sources are written by experts for a specialized audience, published in peer-reviewed journals or by respected publishers, and contain well-supported, evidence-based arguments. Popular sources are written for a general (or possibly niche) public audience, often in an informal or journalistic style, and are published in newspapers, magazines, and websites with a purpose of entertaining or promoting a product; evidence is often “soft” rather than hard. |
| Content: Methodology | What is the methodology of the study? Or how has the evidence been collected? Is the methodology sound? Can you find obvious flaws? What is its scope? Does it apply to your project? How? How recent and relevant is it? What is the publication date or last update? |
| Content: Data | Is there sufficient data to support their claims or hypotheses? Do they offer quantitative and/or qualitative data? Are visual representations of the data misleading or distorted in some way? |
| Purpose | Why has this author presented this information to this audience? Why am I using this source? Will using this source bolster my credibility or undermine it? Am I “cherry-picking” – using inadequate or unrepresentative data that only supports my position while ignoring a substantial amount of data that contradicts it? Could “cognitive bias” be at work here? Have I only consulted the kinds of sources I know will support my idea? Have I failed to consider alternative kinds of sources? Am I representing the data I have collected accurately? Are the data statistically relevant or significant? |
Knowledge Check
Beware of Logical Fallacies
We all have biases when we write or argue; however, when evaluating sources, you want to be on the lookout for bias that is unfair, one-sided, or slanted. Here is an example: Given the pie chart in Figure 7.3.2, if you only consulted articles that rejected global warming in a project related to that topic, you would be guilty of cherry-picking and cognitive bias.

When evaluating a source, consider whether the author has acknowledged and addressed opposing ideas, potential gaps in the research, or limits of the data. Look at the kind of language the author uses: Is it slanted, strongly connotative, or emotionally manipulative? Is the supporting evidence presented logically, credibly, and ethically? Has the author cherry-picked or misrepresented sources or ideas? Does the author rely heavily on emotional appeal? There are many logical fallacies that both writers and readers can fall prey to (see Table 7.3.4 and for more information refer to Chapter 3.4 ). It is important to use data ethically and accurately, and to apply logic correctly and validly to support your ideas.
Table 7.3.4 Common logical fallacies
[Skip Table] | |
Bandwagon Fallacy |
Argument from popularity – “Everyone else is doing it, so we should too!” |
---|---|
Hasty Generalization |
Using insufficient data to come to a general conclusion. E.g., An Australian stole my wallet; therefore, all Australians are thieves! |
Unrepresentative Sample |
Using data from a particular subset and generalizing to a larger set that may not share similar characteristics. E.g., Giving a survey to only female students under 20 and generalizing results to all students. |
False Dilemma |
“Either/or fallacy” – presenting only two options when there are usually more possibilities to consider E.g., You're either with us or against us. |
Slippery Slope |
Claiming that a single cause will lead, eventually, to exaggerated catastrophic results. |
Slanted Language |
Using language loaded with emotional appeal and either positive or negative connotation to manipulate the reader |
False Analogy |
Comparing your idea to another that is familiar to the audience but which may not have sufficient similarity to make an accurate comparison E.g., Governing a country is like running a business. |
Post hoc, ergo prompter hoc |
“After this; therefore, because of this” E.g., A happened, then B happened; therefore, A caused B. Just because one thing happened first, does not necessarily mean that the first thing caused the second thing. |
Circular Reasoning |
Circular argument - assuming the truth of the conclusion by its premises. E.g., I never lie; therefore, I must be telling the truth. |
Ad hominem |
An attack on the person making an argument does not really invalidate that person’s argument. It might make them seem a bit less credible, but it does not dismantle the actual argument or invalidate the data. |
Straw Man Argument |
Restating the opposing idea in an inaccurately absurd or simplistic manner to more easily refute or undermine it. |
Others? |
There are many more… can you think of some? For a bit of fun, check out Spurious Correlations. |
Please refer to Chapter 3.4 Writing to Persuade to find out how you can check your work for logical fallacies.
Knowledge Check
Critical thinking lies at the heart of evaluating sources. You want to be rigorous in your selection of evidence because, once you use it in your paper, it will either bolster your own credibility or undermine it.
Notes
- Cover images from journals are used to illustrate the difference between popular and scholarly journals, and are for noncommercial, educational use only.
References
Canadian Investor Relations Institute (CIRI).(2018) CIRI advisor - Using Non-Traditional Data for Market and Competitive Int....pdf
Government of Canada. Statistics Canada. http://www.statcan.gc.ca/eng/start
Last, S. (2019). Technical writing essentials. BCcampus. https://pressbooks.bccampus.ca/technicalwriting/
Microsoft. (2025, January 20). Non-traditional sources of information. Copilot 4o, Microsoft Enterprise Version.
McFarland, A. (2025, updated February 5). Google Gemini vs. OpenAI Deep Research: Which Is Better? - Techopedia
Mollick, E. (2025, February 3). The End of Search, The Beginning of Research
Seneca Libraries. (2013, July 2). Evaluating Websites [Video]. YouTube. https://youtu.be/35PBCC5TKxs
Seneca Libraries. (2013, July 2). Popular and Scholarly Resources [Video]. YouTube. https://youtu.be/wPj-BBB0le4
Seneca Libraries. (2021, January 4 updated). Writing and Communicating Technical Information: Key Resources. Seneca College. https://library.senecacollege.ca/technical/keyresources
What is the source of the H.G. Wells quote, ‘Statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write/”? (n.d.). Quora. https://www.quora.com/What-is-the-source-of-the-H-G-Wells-quote-Statistical-thinking-will-one-day-be-as-necessary-for-efficient-citizenship-as-the-ability-to-read-and-write
Wright-Tau, C. (2013). There’s No Denying It: Global Warming is Happening | Home on the Range
In the era of AI hallucinations, “fake news,” deliberate misinformation, and “alternative facts,” businesses must rely on solid and verifiable information to move their efforts forward. Finding reliable information can be easy, if you know where to look and how to evaluate it. You may already be familiar with traditional sources of information, like library databases, government publications, journals, and the like, but with the increasing use of AI in data gathering, you may also want to consider non-traditional sources as well. This chapter will guide you on finding credible sources and evaluating research information using traditional and non-traditional resources.
Finding Research Information
Research can be obtained from primary and secondary sources. Primary research consists of original work, like experiments, focus groups, interviews, and the like, that generates raw information or data that are then interpreted in reporting. Secondary research consists of finding information and data that have been gathered by others and typically reported and published in some usable form. More often than not, researchers use both types of research in order to create a balance between original data and those already interpreted by other researchers.
Primary Research
Primary research is any research that you do yourself in which you collect raw data directly from the “real world” rather than from articles, books, or internet sources that have already collected and analyzed the data. Primary research in business is most often gathered through interviews, surveys and observations:
- Interviews: one-on-one or small group question and answer sessions. Interviews will provide detailed information from a small number of people and are useful when you want to get an expert opinion on your topic. For such interviews, you may need to have the participants sign an informed consent form before you begin.
- Surveys/Questionnaires: a form of questioning that is less flexible than interviews, as the questions are set ahead of time and cannot be changed. These involve much larger groups of people than interviews but result in fewer detailed responses. Informed consent is made a condition of survey completion: the purpose of the survey/questionnaire and how the data will be treated are explained in the introductory statement, and participants then choose whether or not to proceed.
- Naturalistic observation in non-public venues: involves taking organized notes about occurrences related to your research. Observations allow you to gain objective information without the potentially biased viewpoint of an interview or survey. In naturalistic observations, the goal is to be as unobtrusive as possible, so that your presence does not influence or disturb the normal activities you want to observe. If you want to observe activities in a specific workplace, classroom, or other non-public places, you must first seek permission from the manager of that place and let participants know the nature of the observation. Observations in public places do not normally require approval. However, you may not photograph or video your observations without first getting the participants’ informed voluntary consent and permission.
While these are the most common methods, others are also gaining traction. Further examples of primary research include engaging with people and their information via social media, creating focus groups, running beta tests or prototype trials, and conducting medical and psychological studies, some of which require a detailed review process.
Secondary Research
Secondary research information can be obtained from a variety of sources, some of which involve a slow publication process, such as academic journals, while others involve a more rapid publication process, such as magazines (see Figure 7.3.1). Academic journals typically involve a slow publication process due to the peer review cycle. They contain articles written by scholars, often presenting their original research, reviewing the original research of others, or performing a “meta-analysis” (an analysis of multiple studies that analyze a given topic). The peer review process involves the evaluation and critique of pre-publication versions of articles, giving the authors the opportunity to justify and revise their work. Once this process is complete (it may take several review cycles and up to two years), the article is published. This rigorous peer review process helps validate research reporting and is why such articles are considered more reliable than unreviewed materials.
Figure 7.3.1 Examples of Popular vs Scholarly Sources (Last, 2019).1
Valid information can also be found in popular publications, however. Such publications have a more rapid publication process without peer review. Though the articles published in these venues are often of high quality, they warrant extra scrutiny and skepticism when used in research because of the lack of peer oversight. In addition, such publications may have editorial boards that serve specific political, religious, economic, or social agendas, which may create a bias in the type of content offered. So be selective about which popular publication you turn to for information.
For more information on popular vs scholarly articles, watch this Seneca Libraries video: Popular and Scholarly Resources.
Traditional academic sources: Scholarly articles published in academic journals are usually required sources in academic research; they are also an integral part of business reports. But they are not the only sources for credible information. Since you are researching in a professional field and preparing for the workplace, you will draw upon many kinds of traditional credible sources. Table 7.3.1 lists several types of such sources.
Table 7.3.1 Typical traditional academic research sources for business
| Traditional Academic Secondary Sources | Description |
|---|---|
| Academic Journals, Conference Papers, Dissertations, etc. | Scholarly (peer-reviewed) academic sources publish primary research done by professional researchers and scholars in specialized fields, as well as reviews of that research by other specialists in the same field. For example, the Journal of Computer and System Sciences publishes original research papers in computer science and related subjects in system science; the International Journal of Business Communication is one of the most highly ranked journals in the field. |
| Reference Works (often considered tertiary sources) | Specialized encyclopedias, handbooks, and dictionaries can provide useful terminology and background information. For example, the Encyclopedia of Business and Finance is a widely recognized authoritative source. You may cite Wikipedia or dictionary.com in a business report, but be sure to compare the information to other reliable sources before use. |
| Books and Chapters in Books | Books written by specialists in a given field usually contain reliable information and a References section that can be very helpful in providing a wealth of additional sources to investigate. For example, The Essential Guide to Business Communication for Finance Professionals by Jason L. Snyder and Lisa A.G. Frank has an excellent chapter on presentation skills. |
| Trade Magazines and Popular Science Magazines | Reputable trade magazines contain articles relating to current issues and innovations; therefore, they can be useful in determining what is “cutting edge,” or finding out what current issues or controversies are affecting business. Examples include the Harvard Business Review, The Economist, and Forbes. |
| Newspapers (Journalism) | Newspaper articles and media releases offer a sense of what journalists and people in industry think the general public should know about a given topic. Journalists report on current events and recent innovations; more in-depth “investigative journalism” explores a current issue in greater detail. Newspapers also contain editorial sections that provide personal opinions on these events and issues. Choose well-known, reputable newspapers such as The Globe and Mail. |
| Industry Websites (.com) | Commercial websites are generally intended to “sell” a product or service, so you have to select information carefully. These websites can also give you insights into a company’s “mission statement,” organization, strategic plan, current or planned projects, archived information, white papers, business reports, product details, cost estimates, annual reports, etc. |
| Organization Websites (.org) | A vast array of .org sites can be very helpful in supplying data and information. These are often public service sites designed to share information with the public. |
| Government Publications and Public Sector Websites (.gov/.edu/.ca) | Government departments often publish reports and other documents that can be very helpful in determining public policy, regulations, and guidelines that should be followed. Statistics Canada, for example, publishes a wide range of data. University websites also offer a wide array of non-academic information, such as strategic plans, facilities information, etc. |
| Patents | You may have to distinguish your innovative idea from previously patented ideas; you can look these up and get detailed information on patented or patent-pending ideas. |
| Public Presentations | Public consultation meetings and representatives from business and government speak to various audiences about current issues and proposed projects. These can be live presentations or video presentations available on YouTube or TED Talks. |
| Other | Can you think of some more? (Radio programs, podcasts, social media, etc.) |
You may want to check out Seneca Libraries Business Library Guide for information on various academic sources for business research information.
Business-related sources: The use of AI and other technologies such as sensors and satellites in information collection has also led to other types of information sources that may not be considered the norm, but which nonetheless offer equally valuable and credible research information. Some of these are used to collect large amounts of data (big data). Here in Table 7.3.2 is a selection of such sources (Note: Copilot and Elicit are the only approved AI tools at Seneca):
Table 7.3.2 Business-related secondary sources (CIRI, 2018; Microsoft, 2025)*
*NOTE: These non-traditional research sources are used in business and industry, but would not normally be used for academic research.
| Business-Related Secondary Sources | Description |
|---|---|
| Automated Searches | AI tools like Elicit and Consensus will use key words and research questions to quickly aggregate relevant academic publications available in the Semantic Scholar database. Such tools offer additional features, such as summarization, insights, and citations. ChatGPT, Copilot, Gemini, Claude, and other LLM models are also being used for searches. Though their accuracy and citations are improving and vary from model to model, output must be carefully reviewed prior to use. |
| Social Media (LinkedIn, X, Blue Sky, Mastodon, Facebook Groups) | Depending on who you follow, you can find brief useable content and links to articles and other information that can be used as research sources on social media. For example, experts on LinkedIn, Blue Sky, and X often share their published research, blog posts, videos, and other materials that are then discussed by other experts. The original posts, attachments/links, and comments are all rich sources of information, but they must be carefully evaluated prior to use. |
| Alternative Media | News sources such as those found on Canadian sites like Rabble and in magazines like The Walrus offer timely articles on current events, politics, and social issues. A variety of alternative news sources exist that satisfy a range of political and social perspectives, both from the left and the right, so you must use a critical lens to evaluate these sources for bias. |
| Alternative Financial and Other Worlds | Cryptocurrency and metaverse (simulated worlds and gaming) data are growing sources of information for consumer behaviour, investment, game development, and financial trends. |
| Blogs | Experts in various fields post theories, research findings, and developing thinking on the latest topics in their field on blogs like WordPress or Substack or on personal websites. Be selective, as blog postings usually consist of work in progress, so the thinking can change depending on what the author's research reveals over time. |
| Online Communities | Sites like Reddit and Discord are digital communities where posts are organized according to interest areas. Carefully evaluate all information for credibility and accuracy, as the posts are often opinion and experience based. |
| Crowdsourcing Platforms | Research into new products, services, and tools that are in the pre-release phases can be done by searching not only patents but also crowdsourcing platforms like Kickstarter and StackExchange. |
| Sensor and Wearable Technologies | Sensor and wearable technology data such as heart monitors, fitness sensors, diabetes monitors, car health, smart meters, and the like all aid in the collection of personal, physical, mechanical, and environmental data. |
| Mobile and Transactional Applications | Mobile apps will collect data on user behaviour, consumer preferences, and other information. Transactional technologies will capture data relating to financial, retail, and consumer behaviour and preferences, and the like. |
| Website and Other Internet Technologies | The internet can collect data about consumer behaviours and operations from various devices as they link to the network. Specific websites collect data on usage, page views, and consumer behaviours. Some tools will scrape the internet to collect data on competitors, market price trends, and product availability (see the sketch below the table). |
| Satellite and Geolocation Technologies | Satellite data and imagery, such as those gathered by Orbital Insight, can assist with geospatial information gathering that can be used in areas like supply chain, real estate, as well as governance and geopolitics. Geolocation data are collected from devices that permit the gathering of information based on geographical location. SafeGraph and Placer.ai, for example, gather data and create insights relating to location, foot traffic, community context, and more to develop marketing and branding information. |
| Weather and Environmental Technologies | In the real estate, manufacturing, supply chain, aviation, and agricultural sectors, for example, instruments gather data that will provide information on weather patterns and trends, air and water quality, soil and air conditions, water levels and conditions, as well as other factors. |
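As a concrete illustration of the “Website and Other Internet Technologies” row, a very simple price-monitoring script might fetch a public web page and pull out product names and prices. The sketch below is hypothetical: the URL and CSS selectors are placeholders, and any real data collection of this kind must respect a site’s terms of use and the privacy laws noted below.

```python
# Hypothetical sketch of simple web data collection (see the table row above).
# The URL and CSS selectors are placeholders; check a site's terms of use and
# robots.txt before scraping, and prefer an official API where one exists.

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/products"  # placeholder address

response = requests.get(URL, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Collect (name, price) pairs from elements matching the placeholder selectors.
for item in soup.select(".product"):
    name = item.select_one(".name")
    price = item.select_one(".price")
    if name and price:
        print(name.get_text(strip=True), price.get_text(strip=True))
```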
Be reminded that when gathering, storing, and using data, especially that related to people's personal information, you must abide by the Canadian federal and provincial privacy laws. Please refer to the Personal Information Protection and Electronic Documents Act (PIPEDA) and Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) for more information. See more about protecting people's information in Chapter 7.5 Research Ethics.
Deep Research Reasoners and Agents
LLMs have developed reasoning capabilities to the point that they can now complete complex research tasks, and, paired with agentic technologies, they can do so autonomously (McFarland, 2025; Mollick, 2025). OpenAI's Deep Research and Google's Deep Research, for example, are AI agents (autonomous, task-focused AI tools) that can produce copious and largely reliable analyses, summaries, insights, and other materials based on the materials and direction provided by researchers.
The Google and OpenAI tools are not the same, however. As AI journalist Alex McFarland (2025) points out, the capabilities of the two models differ substantially. OpenAI's Deep Research, for example, goes about the research in a less structured manner than Google's model, resulting in broader and more comprehensive results that are reliable and of high quality (Mollick, 2025). Its output shows that the agent takes a more creative approach, following information as it is revealed and pursuing unexpected avenues of search to surface deeper information; OpenAI's model also shows its reasoning process (McFarland, 2025; Mollick, 2025). Google's Deep Research, on the other hand, uses a more structured approach that relies heavily on information provided by the user (McFarland, 2025). Its output is limited to primary information, reports, and links that are often culled from websites, including paywalled sites, that vary in quality and reliability (Mollick, 2025).
The kind of research you are doing will help determine which tool is worth paying for. According to McFarland (2025), if you are a professional in finance, academia, or policy, for example, OpenAI's model would be worth the cost; if you are a casual user, the Google model would be better suited to you. Be reminded that Copilot is the only approved LLM for use at Seneca, and it really should not be used for research purposes.
Evaluating Research Materials
Mark Twain, supposedly quoting British Prime Minister Benjamin Disraeli, famously said, "There are three kinds of lies: lies, damned lies, and statistics." On the other hand, H.G. Wells has been (mis)quoted as stating, "statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write" (Quora, n.d.). The fact that the actual sources for both “quotations” are unverifiable makes their sentiments no less true. The effective use of statistics can play a critical role in influencing public opinion as well as persuading in the workplace. However, as the fame of the first quotation indicates, statistics can be used to mislead rather than accurately inform—whether intentionally or unintentionally.
The importance of critically evaluating your sources for authority, relevance, timeliness, and credibility therefore cannot be overstated. Anyone can put anything on the internet, and people with strong web and document design skills can make this information look very professional and credible, even if it isn't. Moreover, LLMs are notorious for producing content that is inaccurate and often unverifiable. Since much research is currently done online, and many sources are available electronically, developing your critical evaluation skills is crucial to finding valid, credible evidence to support and develop your ideas. In fact, corroboration of information has become such a challenging issue that sites like this List of Predatory Journals regularly update their online lists of journals that subvert the peer review process and simply publish for profit.
When evaluating research sources, regardless of their origin (LLM output or traditional research), be careful to critically evaluate the authority, content, and purpose of the material, using the questions in Table 7.3.3. You should also ensure that the claims included in LLM output come with source information and that you corroborate or check those sources for accuracy. For more information on evaluating sources, also view this brief Seneca Libraries video, Evaluating Websites.
It may be tempting to ask an LLM to go back through the output it created and check it for accuracy and bias. But why would you do that? If it created erroneous or unsupported content to begin with, you would not want it checking its own output. That responsibility is yours. Here are some tools to help you through this process.
Table 7.3.3 A question-guide for evaluations of the authority, content, and purpose of information
| Criterion | Questions to Consider |
|---|---|
| Authority (researchers, authors, creators) | Who are the researchers/authors/creators? Who is their intended audience? What are their credentials/qualifications? What else has this author written? Is this research funded? By whom? Who benefits? Who has intellectual ownership of this idea? How do I cite it? Where is this source published? What kind of publication is it? Authoritative sources: written by experts for a specialized audience, published in peer-reviewed journals or by respected publishers, and containing well-supported, evidence-based arguments. Popular sources: written for a general (or possibly niche) public audience, often in an informal or journalistic style, published in newspapers, magazines, and websites with a purpose of entertaining or promoting a product; evidence is often “soft” rather than hard. |
| Content: methodology | What is the methodology of the study? Or how has the evidence been collected? Is the methodology sound? Can you find obvious flaws? What is its scope? Does it apply to your project? How? How recent and relevant is it? What is the publication date or last update? |
| Content: data | Is there sufficient data to support the claims or hypotheses? Do they offer quantitative and/or qualitative data? Are visual representations of the data misleading or distorted in some way? |
| Purpose | Why has this author presented this information to this audience? Why am I using this source? Will using this source bolster my credibility or undermine it? Am I “cherry-picking” (using inadequate or unrepresentative data that only supports my position while ignoring a substantial amount of data that contradicts it)? Could “cognitive bias” be at work here? Have I only consulted the kinds of sources I know will support my idea? Have I failed to consider alternative kinds of sources? Am I representing the data I have collected accurately? Are the data statistically relevant or significant? |
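If you are evaluating many sources, it can also help to keep a structured record of your answers to the questions in Table 7.3.3. The following is a minimal, illustrative sketch in Python; the field names and the example entry are our own and are not part of the table.

```python
# Illustrative sketch only: one way to record answers to the
# authority/content/purpose questions in Table 7.3.3.

from dataclasses import dataclass

@dataclass
class SourceEvaluation:
    citation: str
    authority_notes: str = ""    # credentials, funding, publication venue
    methodology_notes: str = ""  # how evidence was collected; scope; currency
    data_notes: str = ""         # sufficiency; quantitative/qualitative; visuals
    purpose_notes: str = ""      # why you are using it; cherry-picking risk
    corroborated: bool = False   # have you verified any LLM-supplied claims?

evaluations = [
    SourceEvaluation(
        citation="Last, S. (2019). Technical writing essentials. BCcampus.",
        authority_notes="Open textbook by an academic author; published by BCcampus.",
        corroborated=True,
    ),
]

# Flag any sources whose claims still need to be checked against the originals.
for e in evaluations:
    status = "corroborated" if e.corroborated else "needs corroboration"
    print(f"{e.citation} -> {status}")
```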
Knowledge Check
Beware of Logical Fallacies
We all have biases when we write or argue; however, when evaluating sources, you want to be on the lookout for bias that is unfair, one-sided, or slanted. Here is an example: Given the pie chart in Figure 7.3.2, if you only consulted articles that rejected global warming in a project related to that topic, you would be guilty of cherry-picking and cognitive bias.

When evaluating a source, consider whether the author has acknowledged and addressed opposing ideas, potential gaps in the research, or limits of the data. Look at the kind of language the author uses: Is it slanted, strongly connotative, or emotionally manipulative? Is the supporting evidence presented logically, credibly, and ethically? Has the author cherry-picked or misrepresented sources or ideas? Does the author rely heavily on emotional appeal? There are many logical fallacies that both writers and readers can fall prey to (see Table 7.3.4; for more information, refer to Chapter 3.4). It is important to use data ethically and accurately, and to apply logic correctly and validly to support your ideas.
Table 7.3.4 Common logical fallacies
| Fallacy | Description |
|---|---|
| Bandwagon Fallacy | Argument from popularity: “Everyone else is doing it, so we should too!” |
| Hasty Generalization | Using insufficient data to come to a general conclusion. E.g., an Australian stole my wallet; therefore, all Australians are thieves! |
| Unrepresentative Sample | Using data from a particular subset and generalizing to a larger set that may not share similar characteristics. E.g., giving a survey to only female students under 20 and generalizing results to all students. |
| False Dilemma | “Either/or fallacy”: presenting only two options when there are usually more possibilities to consider. E.g., you're either with us or against us. |
| Slippery Slope | Claiming that a single cause will lead, eventually, to exaggerated catastrophic results. |
| Slanted Language | Using language loaded with emotional appeal and either positive or negative connotation to manipulate the reader. |
| False Analogy | Comparing your idea to another that is familiar to the audience but which may not have sufficient similarity to make an accurate comparison. E.g., governing a country is like running a business. |
| Post hoc, ergo propter hoc | “After this; therefore, because of this.” E.g., A happened, then B happened; therefore, A caused B. Just because one thing happened first does not necessarily mean that the first thing caused the second. |
| Circular Reasoning | Circular argument: assuming the truth of the conclusion in its premises. E.g., I never lie; therefore, I must be telling the truth. |
| Ad hominem | An attack on the person making an argument does not really invalidate that person’s argument. It might make them seem a bit less credible, but it does not dismantle the actual argument or invalidate the data. |
| Straw Man Argument | Restating the opposing idea in an inaccurately absurd or simplistic manner to more easily refute or undermine it. |
| Others? | There are many more… can you think of some? For a bit of fun, check out Spurious Correlations. |
Please refer to Chapter 3.4 Writing to Persuade to find out how you can check your work for logical fallacies.
Knowledge Check
Critical thinking lies at the heart of evaluating sources. You want to be rigorous in your selection of evidence because, once you use it in your paper, it will either bolster your own credibility or undermine it.
Notes
- Cover images from journals are used to illustrate the difference between popular and scholarly journals, and are for noncommercial, educational use only.
References
Canadian Investor Relations Institute (CIRI). (2018). CIRI advisor - Using Non-Traditional Data for Market and Competitive Int....pdf
Government of Canada. (n.d.). Statistics Canada. http://www.statcan.gc.ca/eng/start
Last, S. (2019). Technical writing essentials. BCcampus. https://pressbooks.bccampus.ca/technicalwriting/
Microsoft. (2025, January 20). Non-traditional sources of information. Copilot 4o, Microsoft Enterprise Version.
McFarland, A. (2025, February 5, updated). Google Gemini vs. OpenAI Deep Research: Which Is Better? Techopedia.
Mollick, E. (2025, February 3). The End of Search, The Beginning of Research.
Seneca Libraries. (2013, July 2). Evaluating Websites [Video]. YouTube. https://youtu.be/35PBCC5TKxs
Seneca Libraries. (2013, July 2). Popular and Scholarly Resources [Video]. YouTube. https://youtu.be/wPj-BBB0le4
Seneca Libraries. (2021, January 4, updated). Writing and Communicating Technical Information: Key Resources. Seneca College. https://library.senecacollege.ca/technical/keyresources
What is the source of the H.G. Wells quote, “Statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write”? (n.d.). Quora. https://www.quora.com/What-is-the-source-of-the-H-G-Wells-quote-Statistical-thinking-will-one-day-be-as-necessary-for-efficient-citizenship-as-the-ability-to-read-and-write
Wright-Tau, C. (2013). There’s No Denying It: Global Warming is Happening. Home on the Range.
Regardless of the tools you use to find information to support your ideas, the strength of your reporting will always lie in the quality and kind of support you include, be it raw or analyzed data or qualitative information (see Chapter 7.3), and in how you treat that information. In all cases, the ways in which you collect, analyze, and use data must be ethical and consistent with professional standards of safety, privacy, validity, and overall integrity. Lapses in any of these areas can lead not only to poor-quality reports in an academic context (poor grades and academic dishonesty penalties) but also, in the workplace, to lawsuits, job loss, and even criminal charges. For this reason, a research approach characterized by integrity and the pursuit of facts will help you reveal information that supports your ideas and creates strongly argued points. Whether you are using LLMs and other AI tools in your research process or traditional methods, following these principles will go far in bolstering your own professional credibility:
- Ethical use: Researchers are obligated to make ethical use of all research materials, including those generated by LLMs and other AI tools. Ethical use involves ensuring accuracy, avoiding bias, promoting inclusivity, as well as citing and declaring sources. In addition, when humans are subjects of the research, practices that do no harm are required (more on this below).
- Accuracy: Since LLM output is known to often lack accuracy and to contain hallucinations, you must review all output that you use to ensure its accuracy. You also have an obligation to make accurate use of any research information by ensuring that the information represents the intent of the original author(s) when using secondary sources. When using data you generate yourself, you also have an obligation to ensure the integrity of the information.
- Corroboration: Corroboration is especially necessary when using LLM output. When LLMs make claims that are unsupported by accurate citations, researchers must verify them by analyzing reports where original research is published. In traditional research, corroboration of facts is necessary when a study is limited or if it raises questions.
- Declaration of AI Use and Citation: All research sources used in your reporting must be cited using standard methods like the APA, IEEE, or MLA style. When traditional citation methods are not applicable, such as in cases of LLM-human co-creation, researchers must declare their use of genAI applications, including model, mode, date, methods, and prompts.
- Diversity and Inclusion: Some factors become more nuanced given contextual, population, health, and economic considerations. Accounting for the needs and interests of various groups in your research practice will lead to a more inclusive approach to gathering information and analysis of findings.
- Privacy: When gathering, storing, and using data, you must abide by the Canadian federal and provincial privacy laws. Please refer to the Personal Information Protection and Electronic Documents Act (PIPEDA) and Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) for more information.
- Transparency: Be transparent in your research methods. Most reporting documents include a Methods section where research methods are described. If you have made use of AI technologies, you must include that in the description of methods. Be specific as to which tools, formulas, databases, etc. you have employed. Keep research documentation in your records, along with drafts, prompts, and outputs, in case they are requested.
- Validity and reliability: Validity and reliability are qualities that your research information must have for it to support your ideas and withstand scrutiny. Reliability refers to the "consistency and stability" of the research results, while validity refers to the "accuracy with which a research measures what it intends to measure" (Salomao, 2023).
When Humans are Subjects of Your Research
Oftentimes, primary research will involve human subjects, as in stakeholder consultations, including focus groups, interviews, surveys (see Chapter 7.4), psychological studies, and others. If you are collecting data from human participants, you are engaging in “human research,” and you must be aware of and follow the strict ethical guidelines of your academic institution or the company you represent.
In Canada, any post-secondary educational institution that receives funding from one of the three federal granting bodies must ensure that all research involving humans conducted at that institution complies with the Tri-Council Policy Statement (2018). As an example of institutional ethics requirements, you may want to consult Seneca's Ethical Conduct for Research Involving Human Subjects Policy (2007). In addition, organizations across Canada more broadly must abide by the standards set out by the Human Research Standards Council and the Digital Governance Council. These rules and standards are in place to protect people and communities from potential risk or harm from research and to ensure ethical conduct, including the protection of privacy. Human research in Canada should align with the principles expressed in the Belmont Report, summarized as follows by Last, LeMire, and Hagstrom-Schmidt (2022):
. . .human subjects research is guided by three core principles outlined in the Belmont Report: respect for persons, beneficence, and justice (The Belmont Report, 1979). The first principle, respect for persons, means that researchers must respect the autonomy of research participants and provide protections against coercion, particularly for vulnerable populations. The second principle, beneficence, means that researchers have an obligation to enact the following rules: “(1) do not harm and (2) maximize possible benefits and minimize possible harms” (The Belmont Report, 1979). The third principle, justice, means that research participation should be distributed, rather than concentrated heavily on one population.
Knowledge Check: Review the video below and respond to questions as they appear.
To ensure that research involving humans proceeds ethically, researchers must follow an approval process before conducting certain kinds of research (such as surveys, interviews, or other kinds of human-involved research). The approval process most often involves presenting the research plan to a Research Ethics Board (REB). REBs follow guidelines and procedures set out by organizations like the Human Research Standards Council and the Digital Governance Council and other regulatory and accreditation bodies. The REB will review the project plan and make recommendations to ensure that humans are treated ethically and that the proposed research is otherwise sound. Check with your organization’s REB prior to beginning a human-involved research project to ensure alignment with its policies.
Avoiding Research Ethics Violations
As a researcher, you are obligated to be aware of the many ways in which your research can fail due to ethical lapses or outright violations. When a considerable amount of time and effort goes into completing research, you want to ensure that it is collected, treated, stored, and shared in ways that abide by ethical standards. Figure 7.5.1 presents many of the ethical lapses and violations that must be avoided:

The following is a brief explanation of these various lapses and intentional actions that can lead to questions surrounding ethical conduct in research:
No informed consent for data collection: Informed consent involves obtaining the consent of human participants prior to collecting information from them. The process involves researchers informing participants about the research goals and how the information will be used and protected. For example, when planning a focus group or survey, you must inform the participants and respondents of the research goals, what information about them is being gathered, how it will be used and protected, and whether participation is voluntary. Consent can be explicit (as when participants check a box or apply their signature to agree to the process) or implicit, as for surveys that include a statement that “completion of these survey questions implies consent.” PIPEDA clearly explains how all organizations and institutions are bound by law to protect people’s information.
No permissions obtained for data/information usage: Part of the informed consent process is to also request permission to make use of the data. When making that request, you must be transparent about how the data are going to be used so that participants in your research can give informed consent. According to PIPEDA, researchers who make use of data in ways other than those specified in the informed consent are violating the law.
Simultaneous and duplicate submissions (Uzon, 2013): A simultaneous or duplicate submission occurs when an author submits a research paper for consideration to two or more publications at the same time. Authors do this to save time in the peer review process and to get their work published sooner rather than later. Such a practice is discouraged by journal publishers and can result in the immediate retraction of your paper if it is found to be published in two journals simultaneously.
Salami slicing: “The term Salami Slicing is used often within academia to refer to the needless separation of a single research study, attached dataset, and argument that should form a single publication, into two or several separate publications for submission in different journals” (Adams, 2022). Authors do this to increase the number of publications in which their work appears and to satisfy the pressure to publish that is common in academic settings. Through this process, they are also able to increase the odds of being published and obtain simultaneous feedback on various parts of their work.
Copyright infringement: Whether you are working with data or other types of information, publications, art and literary works, etc., if someone else created it, then it belongs to them and is protected by copyright laws. It is a copyright infringement to make use of other people’s work in your research and reporting without giving them credit through in-text mentions, citation, and documentation (Arnold & Levin, 2021).
LLMs complicate citation practice. When a source is unspecified or unclear, you should cite the LLM model if you are making use of a small amount of output. When making use of a large amount of output, you should extensively revise and add content to the output to effectively make it your own; you would then declare your use of the LLM instead of citing it. More on this topic in Unit 8.
Image manipulation: When using images created by others, it is important to avoid manipulating the images to either distort or misrepresent the information. However, manipulating images to improve clarity is an acceptable practice (Elsevier, 2024). Images must be used fairly in that their original meaning or intent is not altered when you make use of them.
Data fabrication and falsification (Elsevier, 2024): Data fabrication involves making up or creating data or other information to prove your hypothesis or point and support your arguments. Data falsification, on the other hand, involves manipulating existing data, information, as well as research-related procedures, measures, equipment, and materials to gain evidence that aligns with your perspective or argument. It includes ignoring data that obviously disproves or contradicts your ideas, and it involves skewing or applying bias to the research. Both fabrication and falsification can have a significant impact on research integrity and affect professional practices that have safety, health, financial, political, social, and moral implications. When these actions are determined to be intentional, therefore, they are also considered fraudulent activities.
Plagiarism: Plagiarism involves the deliberate use of someone else’s work and passing it off as your own (Arnold & Levin, 2021). Your college writing subjects, COM101 and COM111, spent considerable time and effort giving you an understanding of how and why to avoid academic integrity offences, as well as the consequences of committing them. These practices also extend into the workplace, so it is incumbent upon you to cite and document all information you use that belongs to someone else, including when it is created by an LLM. If you would like to review information on plagiarism and other forms of cheating, consult Seneca Polytechnic's Policy on Academic Integrity and Seneca's Student Resources on Academic Integrity. Unit 8 of this text offers extensive information on how to avoid plagiarism issues and make ethical use of work belonging to others.
(Plagiarism and You, 2020)
Knowledge Check
Authorship issues (CRT, 2025; The Royal Society, n.d.): Research is reported using various vehicles: conference presentations, professional meetings, and publications. These vehicles are evidence of the expertise, professionalization, and credibility of researchers. Research, when shared, can lead to employment, further research opportunities, funding, conference presentations, invited lectures, and other benefits. So, when researchers publish their work, they want their authorship to be fairly represented by being included in the by-line of their publications. However, the process is not always straightforward and can give rise to disputes and other misrepresentations. The Royal Society (n.d.) describes the issues as follows:
- Authorship disputes: Authorship disputes arise when two or more researchers who are involved in a range of research activities, including data analysis, investigation, and the like, are in disagreement about their contributions and the type of credit they should receive. CRT (CRediT) (2025) and The Royal Society (n.d.) recommend that contributions to the research be identified and assigned at the outset of a research project and that authorship, including designating the primary author, also be included in the discussions.
- Ghost authorship: Ghost authorship occurs when the name of someone who participated in the research and who would normally be considered an author is not included in the author list.
- Guest authorship: Guest authorship consists of the practice of including in the author list the name of an established and well-respected researcher or expert in the subject in the hopes that this name will boost the credibility of the research and increase the likelihood of publication.
- Gift authorship: Gift authorship consists of the practice of including the name of someone who did not contribute to the research but is included “perhaps to reward a collaborator, return a favor, or for some other personal or financial gain” (The Royal Society, n.d.).
Conflicts of interest: Conflicts of interest occur when researchers or their affiliates stand to gain advantage from the research outcomes (H2020, 2022). Any actual or perceived conflict of interest must be declared; otherwise, the research may be considered suspect and undergo scrutiny for potential conflict of interest violations and potential fraud. Publications routinely ask for declarations of conflicts of interest as a matter of transparency. If any potential for such conflict exists, it’s best to declare it to maintain your credibility.
Non-disclosure of safety procedures: When safety procedures must be followed to complete the research with reduced risk to the researchers and participants, those procedures must be shared with them. It is how they can protect their own safety. A lack of disclosure of those safety procedures and other important information constitutes a withholding of critical information and is considered unethical.
If you approach your research with good intentions and a proactive mindset that is determined to do the right thing all the way through, you should have positive outcomes. Following the principles for ethical research and avoiding the ethical violations that will call your work into question will enable you to focus on the work at hand rather than on often time-consuming scrutiny.
Content relating to the Belmont Report was obtained from Human Research Ethics (2022) by Suzan Last; Sarah LeMire; and Nicole Hagstrom-Schmidt. Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
References
Adams, N. N. (2022). Salami slicing: Clarifying common misconceptions for social science early-career researchers. SN Social Sciences (2).
Arnold, M., & Levin, S. (2021). Plagiarism vs copyright infringement. Copyright Alliance.
CRT. (2025). CRediT – Contributor Role Taxonomy.
Digital Governance Council. (2024). Digital Governance Council.
Elsevier. (2024). Research fraud: Falsification and fabrication of data.
Enago Academy. (2021). Research and publication ethics [Infographic]. Research ethics and misconduct: What researchers need to know. https://www.enago.com/academy/principles-of-ethical-research/
Government of Canada. (2018/2020, February 19, modified). Tri-Council Policy Statement: Ethical conduct for research involving humans. Panel on Research Ethics. https://ethics.gc.ca/eng/tcps2-eptc2_2018_introduction.html
H2020. (2022). H2020 INTEGRITY - Conflict of interest in research: What is it and why it matters?
Human Research Standards Council. (n.d.). Home - HRSO.
Mollick, E. (2024, October 20). Thinking Like an AI. One Useful Thing Substack.
National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. (1979, April 18). The Belmont Report: Ethical principles and guidelines for the protection of human subjects research. U.S. Department of Health, Education, and Welfare. https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/read-the-belmont-report/index.html
Office of the Privacy Commissioner of Canada. (n.d.). The Personal Information Protection and Electronic Documents Act (PIPEDA). Office of the Privacy Commissioner of Canada.
Seneca College. (2007, October 31). Ethical Conduct for Research Involving Human Subjects Policy. https://www.senecacollege.ca/about/policies/ethical-conduct-for-research-involving-human-subjects-policy.html
Primary research undertaken when embarking on any large-scale project will most likely include “public engagement,” or stakeholder consultation. Public engagement is the broadest term used to describe the increasingly necessary process that companies, organizations, and governments must undertake to achieve a “social license to operate.” Stakeholder engagement includes humans in the decision-making process and can range from simply informing the public about plans for a project, to engaging them in more consultative practices like getting input and feedback from various groups, and even to empowering key community stakeholders in the final decisions.
For projects that have social, economic, and environmental impacts, and especially for those that foster an Indigenous World View, respect for the Sustainable Development Goals, respect for the rights of Indigenous peoples, and a commitment to social change, stakeholder consultation is an increasingly critical part of the planning stage. Creating an understanding of how projects will affect a wide variety of stakeholders is beneficial for both the company initiating the project and the people who will be affected by it. Listening to stakeholder feedback and concerns can be helpful in identifying and mitigating risks that could otherwise slow down or even derail a project. It can also be an opportunity to build into the project values and actions that work towards improving environmental conditions and uplifting communities and the individuals who belong to them. For stakeholders, the consultation process creates an opportunity to be informed, as well as to inform the company about local contexts that may not be obvious, to raise issues and concerns, and to help shape the objectives and outcomes of the project.
What is a Stakeholder?
Stakeholders include any individual or group who may have a direct or indirect “stake” in the project – anyone who can be affected by it, or who can have an effect on the actions or decisions of the company, organization, or government. They can also be people who are simply interested in the matter, but more often they are potential beneficiaries or risk-bearers. They can be internal – people from within the company or organization (owners, managers, employees, shareholders, volunteers, interns, students, etc.) – or external, such as community members or groups, investors, suppliers, consumers, policy makers, etc. Increasingly, arguments are being made for considering non-human stakeholders such as the natural environment (Driscoll & Starik, 2004). The following video, Identifying Stakeholders (2018), further explains the process of identifying stakeholders.
Stakeholders can contribute significantly to the decision-making and problem-solving processes. People most affected by the problem and most directly impacted by its effects can help you
- understand the context, issues and potential impacts,
- determine your focus, scope, and objectives for solutions, and
- establish whether further research is needed into the problem.
People who are attempting to solve the problem can help you
- refine, refocus, prioritize solution ideas,
- define necessary steps to achieving them, and
- implement solutions, provide key data, resources, etc.
There are also people who could help solve the problem but lack awareness of the problem or their potential role. Consultation processes help create awareness of the project so that these people can potentially become involved during its early stages.
Knowledge Check
Stakeholder Mapping
The more a stakeholder group will be materially affected by the proposed project, the more important it is for them to be identified, properly informed, and encouraged to participate in the consultation process. Determining who the various stakeholders are, as well as their level of interest in the project, the potential impact it will have on them, and the power they have to shape the process and outcome is critical. You might start by brainstorming or mind-mapping all the stakeholders you can think of. See Figure 7.4.1 as an example.

LLMs can be used for the mapping process and can reveal stakeholders that you would not think of on your own.
Asking the LLM to Generate a Differentiated List of Stakeholders
You can adapt the following example to create a prompt that would assist in identifying stakeholders using an LLM. For the process to be effective, you will need to include specific information about the proposed project or issue, being mindful of your company's confidentiality policy.
The City of Toronto is thinking of implementing an automated traffic citation system. Before implementation, the City will be consulting with various community stakeholders, including small and large businesses, residents, local community groups, health care and social work agencies, and Indigenous groups. The City will also include in the consultation process law enforcement and potential vendors. Please help me to develop a categorized list of potential stakeholders for this project. I want to classify those who will offer strong support, who would be neutral, and those in opposition.
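If you run this kind of prompt for more than one project, it can be convenient to keep it as a reusable template and fill in the project details each time. The following is a minimal, illustrative sketch in Python; the template wording mirrors the example above, and the function and variable names are our own.

```python
# Illustrative sketch only: turning the sample stakeholder prompt above into a
# reusable template. The placeholder values are hypothetical.

PROMPT_TEMPLATE = (
    "{organization} is thinking of implementing {project}. Before "
    "implementation, {organization} will be consulting with various community "
    "stakeholders, including {stakeholder_groups}. Please help me to develop a "
    "categorized list of potential stakeholders for this project. I want to "
    "classify those who will offer strong support, who would be neutral, and "
    "those in opposition."
)

prompt = PROMPT_TEMPLATE.format(
    organization="The City of Toronto",
    project="an automated traffic citation system",
    stakeholder_groups=(
        "small and large businesses, residents, local community groups, "
        "health care and social work agencies, and Indigenous groups"
    ),
)

print(prompt)  # Paste the result into your approved LLM (e.g., Copilot).
```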
Once stakeholders who may be impacted have been identified, they can be organized into categories or a matrix. One standard method of organizing stakeholders is to determine which are likely to be in support of the project and which are likely to oppose it, and then determine how much power or influence each of those groups has (see Figure 7.4.2 for a visualization of such a matrix). To enlist the LLM for assistance, follow this process:

For example, a mayor of a community has a strong level of influence. If the mayor is in full support of the project, this stakeholder would go in the top right corner of the matrix. Someone who is deeply opposed to the project, but has little influence or power, would go at the bottom left corner.
A matrix like this can help you determine what level of engagement is warranted: where efforts to “consult and involve” might be most needed and most effective, or where more efforts to simply “inform” might be most useful. You might also consider the stakeholders’ level of knowledge on the issue, level of commitment (whether in support or opposed), and resources available.
As you proceed through the stakeholder mapping, and especially when using an LLM to assist, be alert to the potential for bias and exclusion. The process is intended to draw as many relevant stakeholders as possible into the consultation. In addition, there may be other factors that should be included in the matrix, so think beyond the boundaries of this example; the relevant factors will be determined by the project and the populations involved. Being aware of personal, group, and institutional bias, along with contextual variables, will help you keep an open mindset and account for the range of insights needed to proceed thoroughly.
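If your brainstormed list is stored in a file or script, you can also sort stakeholders into the quadrants of the support/influence matrix automatically. The sketch below is a minimal illustration; the stakeholder names, scores, thresholds, and the mapping from matrix position to a suggested engagement level are hypothetical choices, not a standard formula.

```python
# A minimal sketch of sorting stakeholders into a support/influence matrix.
# Names, scores, thresholds, and the engagement mapping are hypothetical.

stakeholders = [
    {"name": "Mayor", "support": 0.9, "influence": 0.9},
    {"name": "Residents' association", "support": -0.6, "influence": 0.3},
    {"name": "Small business owners", "support": -0.2, "influence": 0.6},
]

def quadrant(s):
    """Place a stakeholder in one quadrant of the matrix.

    support runs from -1 (strongly opposed) to +1 (strongly supportive);
    influence runs from 0 (little power) to 1 (a great deal of power).
    """
    side = "supportive" if s["support"] >= 0 else "opposed"
    power = "high influence" if s["influence"] >= 0.5 else "low influence"
    return f"{side}, {power}"

def suggested_engagement(s):
    """A rough, illustrative mapping from matrix position to engagement level."""
    return "involve or collaborate" if s["influence"] >= 0.5 else "inform and consult"

for s in stakeholders:
    print(f'{s["name"]}: {quadrant(s)} -> {suggested_engagement(s)}')
```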
Levels of Stakeholder Engagement
Levels of engagement can range from simply informing people about what you plan to do to actively seeking consent and placing the final decision in their hands. This range, shown in Figure 7.4.3, is typically presented as a “spectrum” or continuum of engagement, from the least to the greatest amount of engagement with stakeholders.

Review the following video, What is Stakeholder Engagement? (2020), for an additional overview of this process.
Depending on the type of project, its potential impacts, and the types and needs of stakeholders, you may use several levels and strategies of engagement across this spectrum, drawing on a variety of tools (see Table 7.4.1). Your approach may focus on one or several of these:
- Inform: Provide stakeholders with balanced and objective information to help them understand the project, the problem, and the solution alternatives. (There is no opportunity for stakeholder input or decision-making.)
- Consult: Gather feedback on the information given. The level of input can range from minimal interaction (online surveys, etc.) to extensive. It can be a one-time contribution or ongoing/iterative opportunities to give feedback to be considered in the decision-making process.
- Involve: Work directly with stakeholders during the process to ensure that their concerns and desired outcomes are fully understood and taken into account. Final decisions are still made by the consulting organization, but with well-considered input from stakeholders.
- Collaborate: Partner with stakeholders at each stage of decision-making, including developing alternative solution ideas and choosing the preferred solution together. The goal is to achieve consensus regarding decisions.
- Empower: Place final decision-making power in the hands of stakeholders. Voting ballots and referenda are common examples. This level of stakeholder engagement is rare and usually includes a small number of people who represent important stakeholder groups.
Table 7.4.1 Typical tools for public engagement
Inform | Consult | Involve / Collaborate / Empower |
---|---|---|
The Consultation Process: Basic Steps
There is no single “right” way of consulting with stakeholders. Each situation will be different, so each consultation process will be context-specific and will require a detailed plan. A poorly planned consultation process can lead to a lack of trust between stakeholders and the company. Therefore, it is critical that the process be carefully mapped out in advance, and that preliminary work is done to determine the needs and goals of the process and the stakeholders involved. In particular, ensure that whatever tools you use are fully accessible to all the stakeholders you plan to consult. For example, an online survey is not of much use to a community that lacks robust Wi-Fi infrastructure. Consider the following steps to structure your consultation process:
- Situation Assessment: Who needs to be consulted about what and why? Define internal and external stakeholders; determine their level of involvement, interest level, and potential impact, as well as their needs and conditions for effective engagement.
- Goal-setting: What is your strategic purpose for consulting with stakeholders at this phase of the project? Define clear understandable goals and objectives for the role of stakeholders in the decision-making process. Determine what questions, concerns, and goals the stakeholders will have and how these can be integrated into the process. Integrate applicable Sustainable Development Goals from the outset so they are not treated as an afterthought.
- Planning/Requirements: Based on situation assessment and goals, determine what engagement strategies to use and how to implement them to best achieve these goals. Ensure that strategies consider issues of accessibility and inclusivity and consider vulnerable populations. Consider legal or regulatory requirements, policies, or conditions that need to be met. For example, because the research will involve humans, the project should be approved by a Research Ethics Board prior to engagement with the public. (For more information on research involving human subjects, go to Chapter 7.5.) During this phase, you should also determine how you will collect, record, track, analyze and disseminate the data.
- Process and Event Management: In this phase, you will devise strategies to keep the planned activities moving forward and on-track, and adjust strategies as needed. Be sure to keep track of all documentation.
- Evaluation: Design an evaluation metric to gauge the success of the engagement strategies; collect, analyze, and act on the data collected throughout the process. Determine how you will report the results of the engagement process back to the stakeholders.
In situations when projects will affect Indigenous communities, land, and resources, you are obligated by the United Nations Office of the High Commissioner for Human Rights to apply the practice of "free, prior, and informed consent of indigenous peoples" (OHCHR, 2013).
Communicating Effectively in Stakeholder Engagement
Effective communication is the foundation of stakeholder consultation. The ability to create and distribute effective information, develop meaningful relationships, build trust, and listen to public input is essential.
The basic communication skills required for any successful stakeholder engagement project include:
- Effective Writing: the ability to create clear and concise written messages using plain language and structural conventions.
- Visual Rhetoric: the ability to combine words, images, and graphics to support, clarify, and illustrate ideas and make complex issues understandable to a general audience.
- Public Speaking/Presenting: the ability to present information to large audiences in a comfortable and understandable way. The ability to create effective visual information that increases the audience’s understanding.
- Interpersonal and Intercultural Skills: the ability to relate to people in face-to-face situations, to make them feel comfortable and secure, and to be mindful of cultural factors that may affect interest level, accessibility, impact, values, or opinions.
- Collaboration Skills: the ability to work effectively, and with little friction, alongside team members. Collaboration typically involves frequent and open communication, cooperation, good will, information sharing, problem-solving, and empathy.
- Active Listening: the ability to focus on the speaker and demonstrate the behaviours that give them the time and safety needed to be heard and understood. The ability to report back accurately and fully what you have heard from participants.
Note that when communicating with Indigenous Elders and Chiefs, specific protocols should be followed to honour the traditions, culture, and history of Indigenous people.
Caring for Your Stakeholders: Introductory Principles
Most often, stakeholder engagement and consultation will involve the participation of humans, since they are the ones who will be providing the input needed to move a project forward. Because some research conducted in the past inflicted harm on participants, strict guidelines have been developed for the ethical treatment of humans in research. When first embarking on a human-involved research project, check in with your professor or workplace manager, who will determine whether an application for approval must be submitted to the Research Ethics Board (REB) at the institution or organization. This approval process helps ensure that the research methods, as well as data collection, storage, and use, all align with ethical practices and laws.
The basic guidelines in the sections below are typically part of ensuring that you are transparent and careful in how you approach your research and your human participants. Remember that when gathering, storing, and using data, especially data gathered from people, you must abide by Canadian federal and provincial privacy laws. Please refer to the Personal Information Protection and Electronic Documents Act (PIPEDA) and Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) for more information. See more about protecting people's information in Chapter 7.5 Research Ethics.
Recruiting Participants
When recruiting potential participants, you must give them the following information before you begin:
- Researcher name(s): Inform them of your name and contact information
- Affiliation: Provide a) the name of your institution or organization, b) your department, and c) your instructor's or manager's name and contact information
- Project Description: Include a brief description of the project in lay language that can be understood by the participants and clearly identifies the problem being addressed; mention the names of key people on your project team
- Purpose: Describe the purpose of your research or consultation/engagement project (objectives), and the benefits you hope will come from this project (overall goal). Your activity should not involve any deception (e.g., claiming to be gathering one kind of information, such as "do you prefer blue or green widgets?", but actually gathering another kind, such as "what percentage of the population is blue/green colour blind?"). See Chapter 7.5 for more on the topic of ethics violations.
- Use of information: Inform participants of the way in which the results will be presented and/or disseminated.
Informed Consent
You must gain the informed consent of the people you will be surveying, interviewing, or observing. This can be done using a consent form they can sign in person, or through an “implied consent” statement such as on an electronic survey. The consent form should include all the information in the “recruiting” section above. In addition, you should
- include a full description of all data collection methods, procedures, and instruments, as well as expectations regarding the amount of time required for participation
- inform participants that their participation is voluntary and that they may withdraw at any time without consequence, even if they have not completed the survey or interview,
- disclose any and all risks or discomfort that participation in the study may involve, and how these risks or discomfort will be addressed, and
- ensure that all participants are adults (19 years of age or older) and fully capable of giving consent; do not recruit from vulnerable or at-risk groups, and do not collect demographic data regarding age, gender, or any other information not relevant to the study (e.g., phone numbers, medications they are taking, whether they have a criminal record, etc.).
- include a method by which the participants can signal their consent, or include an implied consent statement such as the following: "By completing and submitting this survey, you are indicating your consent for the collection and use of information provided."
Managing the Data
Participants should be told what will happen to the data you gather:
- Survey data are anonymous if you do not track who submitted responses. In anonymous surveys, let participants know that once they submit their survey, it cannot be retrieved and removed from the overall results.
- Inform participants of the means by which their anonymity will be protected and data will be kept confidential as well as how the raw data, including tapes, digital recordings, notes, and other types of data will be disposed of at the end of the project.
- Let survey participants know a) that your research results will be reported without their names and identifiers, b) where the data will be stored, c) how it will be "published", and d) what will happen to the raw data once your project is complete.
- Let interview participants know how their information will be used and if their names will be included or cited.
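If you handle raw survey exports yourself (for example, a spreadsheet downloaded from a survey tool), a short script can help you separate direct identifiers from the responses before analysis. The sketch below uses the pandas library and is illustrative only; the file and column names are hypothetical, and your own data-management plan and REB or privacy requirements take precedence.

```python
import pandas as pd

# A minimal, hypothetical sketch: file and column names are placeholders.
df = pd.read_csv("survey_responses.csv")

# Remove direct identifiers so the working data set is de-identified.
identifier_columns = ["name", "email", "ip_address"]
df = df.drop(columns=identifier_columns, errors="ignore")

# Report aggregate results only, without names or identifiers.
print(df["q1_inclusion_rating"].value_counts().sort_index())

# Keep the de-identified working copy; dispose of raw exports according to
# your data-management plan and REB/privacy requirements.
df.to_csv("survey_responses_deidentified.csv", index=False)
```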
There may be additional issues that must be addressed, such as accessibility and cultural considerations, but those listed above are typically included. If you are unsure whether a particular line of inquiry or method of data collection requires ethics approval, ask your instructor or manager. Most importantly, you should always be completely transparent and honest about what and how you are researching.
It may seem like “a lot of fuss” to go through simply to ask people whether they prefer blue widgets or green widgets, but there are important reasons for these guidelines. People participating in your research need to be reassured that you are doing this for a legitimate reason and that the information you are gathering will be treated with respect and care.
For more information about research ethics, see Chapter 7.5.
Asking Questions in Surveys and Interviews (adapted from Divjak, 2007-2025)
Questionnaire Surveys
Invented by Sir Francis Galton, a questionnaire is a research instrument consisting of a set of questions (items) intended to capture responses from respondents in a standardized manner. Questions should be designed in such a way that respondents are able to read, understand, and respond to them in a meaningful way. Surveys are commonly used to gather information from stakeholders because they are efficient at gathering large amounts of data, are low in cost, and, when designed well, can reveal valuable information.
Here are the steps involved in applying a survey in your study:

Constructing a survey questionnaire is an art. Numerous decisions must be made about the content of questions, their wording, format, and sequencing, all of which can have important consequences for the survey responses. In the process, you must use the ethical principles discussed in Chapter 7.5 to guide you. Primarily, you would want to avoid doing anything that would cause physical or emotional harm to your participants. For example, carefully word sensitive or controversial questions and avoid inserting unintended bias or asking leading questions. You want to design questions to get meaningful and accurate responses rather than ambiguous information that is impossible to quantify or analyze. As a result, constructing a survey questionnaire is not a linear straightforward process. Instead, it is an iterative process, in which you would most probably need to produce several versions of the questionnaire and revise them a few times before coming up with a final version of your questionnaire.
Where to start? To write effective survey questions, begin by identifying what you wish to know. In other words, refer to the research questions (general and specific) that have guided your research process to ensure that you collect all relevant data through the survey.
Let’s say you want to understand how students make a successful transition from high school to college. Perhaps you wish to identify which students are comparatively more or less successful in this transition and which factors contribute to students’ success or lack thereof. Let’s suppose you have set up the following general research question: Which factors contribute to students’ success or failure in the process of transition from high school to university? To understand which factors shape (un)successful students’ transitions to university, you’ll need to include questions in your survey about all the possible factors that could affect this transition. Consulting the literature on the topic will certainly help, but you should also take the time to do some brainstorming on your own and to talk with others about what they think may be important. Perhaps time or space limitations won’t allow you to include every single item you’ve come up with, so you’ll also need to rank your questions so that you can be sure to include those that are the most important.
LLMs can assist in drafting survey questions when they are provided with specific information about your project. Remember that LLM output often contains generalized information and content, so careful review and revision of the questions to suit your specific research goals is essential.
Asking the LLM to Draft Survey Questions (adapted from Microsoft Copilot, 2025)
Use the following key elements in your prompt when asking an LLM to draft survey questions:
- Purpose of the Survey: Clearly state the objective of the survey. This helps the LLM understand the context and generate relevant questions.
- Example: "Create survey questions that will help to assess the inclusivity of our workplace environment [specify the type of business or industry]."
- Target Audience: Specify who the survey is intended to address (respondents). This ensures the questions are appropriate for the respondents.
- Example: "The survey is for all employees across different departments and levels."
- Type of Questions: Indicate the types of questions you want (e.g., multiple choice, Likert scale, open-ended*).
- Example: "Include a mix of multiple choice, Likert scale, and open-ended questions."*Find out about question types in the sections below.
- Key Topics or Areas: List the main topics or areas you want to cover in the survey.
- Example: "Focus on experiences of inclusion, diversity, and equity in the workplace."
- Number of Questions: Specify the desired number of questions.
- Example: "Generate 12 questions."
- Tone and Language: Mention any specific tone or language preferences.
- Example: "Use a respectful and inclusive tone."
- Additional Instructions: Include any other specific instructions or constraints.
- Example: "Ensure questions are sensitive to diverse backgrounds and identities."
Here's a sample prompt incorporating these elements:
Prompt for LLM to Create Survey Questions
Help me to develop a survey to assess the inclusivity of our workplace environment. The survey is intended for all employees across different departments and levels. Include a mix of multiple choice, Likert scale, and open-ended questions. I want to focus on experiences of inclusion, diversity, and equity in the workplace. Guide me as I generate 12 questions using a respectful and inclusive tone. Ensure that the questions I develop are sensitive to diverse backgrounds and identities.
By including these elements, you can provide clear and comprehensive instructions to LLMs, ensuring the generated survey questions are relevant and effective in assessing the topic area you are investigating.
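If you generate surveys regularly, you can also assemble such prompts from the elements listed above so that nothing is forgotten. The sketch below is a minimal illustration in Python; the element values are placeholders to be replaced with your own project details.

```python
# A minimal sketch of assembling a survey-question prompt from the key elements
# above. All values are placeholders; replace them with your project's details.

elements = {
    "purpose": "assess the inclusivity of our workplace environment",
    "audience": "all employees across different departments and levels",
    "question_types": "a mix of multiple choice, Likert scale, and open-ended questions",
    "topics": "experiences of inclusion, diversity, and equity in the workplace",
    "number": 12,
    "tone": "a respectful and inclusive tone",
    "constraints": "ensure the questions are sensitive to diverse backgrounds and identities",
}

prompt = (
    f"Help me to develop a survey to {elements['purpose']}. "
    f"The survey is intended for {elements['audience']}. "
    f"Include {elements['question_types']}. "
    f"Focus on {elements['topics']}. "
    f"Generate {elements['number']} questions using {elements['tone']}, "
    f"and {elements['constraints']}."
)

print(prompt)  # paste into your LLM of choice, or send it through an API call
```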
Response Formats
Questions may be unstructured (open-ended) or structured (closed-ended). Unstructured questions ask respondents to provide a response in their own words, while structured questions ask respondents to select an answer from a given set of choices (response options).
Generally, structured (closed-ended) questions predominate in surveys but sometimes survey researchers include open-ended questions as a way to gather additional details from respondents. An open-ended question does not include response options; instead, respondents are asked to reply to the question using their own words. These questions are generally used to find out more about a survey participant’s experiences or feelings about the topic. If, for example, a survey includes closed-ended questions asking respondents to report on their involvement in extracurricular activities during college, an open-ended question could ask respondents why they participated in those activities or what they gained from their participation. While responses to such questions may also be captured using a closed-ended format, allowing participants to share some of their responses in their own words can make the experience of completing the survey more satisfying to them and can reveal motivations or explanations that had not occurred to the researcher.
Surveys should not include too many unstructured (open-ended) questions, as respondents may be reluctant to write detailed, elaborated, and in-depth responses to them. There are also practical reasons for this, such as the complexity of analyzing open-ended responses, especially in the case of large samples.
Designing Good Survey Questions
Responses obtained in survey research are very sensitive to the types of questions asked. Poorly framed or ambiguous questions will likely result in responses with very little value. Every question in a survey should be carefully scrutinized for the following issues:
Is the question clear and understandable?
Survey questions should be stated in clear and straightforward language, preferably in the active voice, and without complicated words or jargon that may not be understood by a typical respondent. The only exception is if your survey is targeted at a specialized group of respondents, such as doctors, lawyers, and researchers, who use such jargon in their everyday environment. Make sure that the language you are using is concise and precise; avoid metaphorical expressions.
In the above example of the transition to college, the criterion of clarity would mean that respondents must understand what exactly you mean by “transition to college”. If you are going to use that phrase in your survey you need to clearly define what the phrase refers to.
Also, pay attention to the usage of words that may have different meanings, e.g., in different parts of the country or among different segments of survey respondents. Take the example of the word “wicked”. This term could be associated with evil; however, in contemporary subcultures it means "very," "really," or "extremely" (Nikki, 2004).
Is the question worded in a negative manner?
Negatively-worded questions should be avoided, and in all cases, avoid double-negatives. Such questions as "Should your local government not raise taxes?" and “Did you not drink during your first semester of college?” tend to confuse many respondents and lead to inaccurate responses. Instead, you could ask: "What is your opinion on your local government intention to raise taxes?" and "Did you drink alcohol in your first semester of college?" These examples are obvious, but hopefully clearly illustrate the importance of careful wording for questions.
Is the question ambiguous?
Survey questions should not contain words or expressions that may be interpreted differently by different respondents (e.g., words like “any” or “just”). For instance, if you ask a respondent "What is your annual income?", it is unclear whether you are referring to salary/wages only or also to dividend, rental, and other income, and whether you mean personal income, family income (including a spouse’s wages), or personal and business income. Different interpretations by various respondents will lead to incomparable responses that cannot be interpreted correctly.
Does the question have biased or value-laden words?
Bias refers to any property of a question that encourages subjects to answer in a certain way. Kenneth Rasinski (1989, cited in Bhattacherjee, 2012) examined several studies on people’s attitudes toward government spending and observed that respondents tend to indicate stronger support for “assistance to the poor” and less for “welfare”, even though both terms had the same meaning. In this study, more support was also observed for “halting rising crime rate” (and less for “law enforcement”), “solving problems of big cities” (and less for “assistance to big cities”), and “dealing with drug addiction” (and less for “drug rehabilitation”). Biased language or tone tends to skew observed responses. It is often difficult to anticipate biased wording in advance, but to the greatest extent possible, survey questions should be carefully scrutinized to avoid biased language.
Is the question on two topics?
Avoid creating questions that are on two topics because the respondent can only respond to one of the topics at a time. For example, "Are you satisfied with the hardware and software provided for your work?" In this example, how should a respondent answer if he/she is satisfied with the hardware but not with the software or vice versa? It is always advisable to separate two-topic questions into separate questions: (1) "Are you satisfied with the hardware provided for your work?" and (2) "Are you satisfied with the software provided for your work?" Another example of a confusing question: "Does your family favour public television?" Some people in the family may favour public TV for themselves, but favour certain cable TV programs such as Sesame Street for their children. Ensure that your questions clearly address one topic.
Is the question too general?
Sometimes, questions that are too general may not accurately convey respondents’ perceptions. If you ask someone how they liked a certain book and provide a response scale ranging from “not at all” to “extremely well”, and that person selects “extremely well”, what does that mean? Instead, ask more specific behavioural questions, such as "Will you recommend this book to others?" or "Do you plan to read other books by the same author?" Likewise, instead of asking "How big is your firm?" (which may be interpreted differently by respondents), ask "How many people work for your firm?" and/or "What are the annual revenues of your firm?", which are both measures of firm size.
Is the question asking for too much detail?
Avoid asking for detail that serves no specific research purpose. For instance, do you need to have the age of each child in a household or is just the number of children in the household acceptable? If unsure, however, it is better to err on the side of detail rather than generality.
Is the question leading or presumptuous?
If you ask, "What do you see are the benefits of a tax cut?" you are presuming that the respondent sees the tax cut as beneficial. But many people may not view tax cuts as beneficial, because tax cuts generally lead to less funding for public schools, larger class sizes, and fewer public services such as police, ambulance, and fire services. So avoid leading or presumptuous questions. Instead, first ask about the respondent's perception of the tax cut, and then direct the question about the benefits of the tax cut only to those respondents who perceive the tax cut as beneficial (see the section on filter questions below).
Is the question relevant only for particular segments of respondents?
If you decide to pose questions about matters with which only a portion of respondents will have had experience, introduce a filter question into your survey. A filter question is designed to identify a subset of survey respondents who are asked additional questions that are not relevant to the entire sample. Perhaps in your survey on the transition to college you want to know whether substance use plays any role in students’ transitions. You may ask students how often they drank alcohol during their first semester of college. But this assumes that all students drank. Certainly, some may have abstained, and it wouldn’t make any sense to ask the nondrinkers how often they drank. So the filter would direct those who drank to the additional questions, while other respondents skip ahead to the next relevant question.
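Automated survey tools implement this kind of branching as skip logic. The sketch below is a minimal illustration of the idea using the drinking example above; the question handling and section names are hypothetical.

```python
# A minimal sketch of filter-question (skip) logic, as implemented by automated
# survey tools. The question and section names are hypothetical.

def next_section(drank_alcohol: str) -> str:
    """Route a respondent based on the filter question.

    Only respondents who answer "yes" see the follow-up questions about
    drinking frequency; everyone else skips ahead.
    """
    if drank_alcohol.strip().lower() == "yes":
        return "Section 2: frequency and context of drinking"
    return "Section 3: study habits"

print(next_section("Yes"))  # -> Section 2
print(next_section("no"))   # -> Section 3
```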
Is the question on a hypothetical situation?
A popular question in many television game shows is “If you won a million dollars on this show, how do you plan to spend it?” Most respondents have never been faced with such an amount of money and have never thought about it, and so their answers tend to be quite random, such as take a tour around the world, buy a restaurant, spend on education, save for retirement, help parents or children, or have a lavish wedding. Hypothetical questions have imaginary answers, which cannot be used for making scientific inferences.
Do respondents have the information needed to correctly answer the question?
Often, we assume that subjects have the necessary information to answer a question when in reality they do not. Even if responses are obtained in such cases, they tend to be inaccurate, given the respondents' lack of knowledge about the question being asked. For instance, we should not ask the CEO of a company about day-to-day operational details about which they may have no knowledge.
In the example of the transition to college, the respondents must have actually experienced the transition to college themselves in order for them to be able to answer the survey questions.
Does the question tend to elicit socially desirable answers?
In survey research, social desirability refers to the idea that respondents will try to answer questions in a way that will present them in a favourable light. Perhaps we decide that to understand the transition to college, we need to know whether respondents ever cheated on an exam in high school or college. We all know that cheating on exams is generally frowned upon. So, it may be difficult to get people to admit to such cheating on a survey. But if you can guarantee respondents’ confidentiality, or even better, their anonymity, chances are much better that they will be honest about having engaged in this socially undesirable behaviour. Another way to avoid problems of social desirability is to try to phrase difficult questions in the most benign way possible. Earl Babbie (2010) offers a useful suggestion for helping you do this—simply imagine how you would feel responding to your survey questions. If you would be uncomfortable, chances are others would as well.
In sum, in order to pose effective and proper survey questions, you should do the following:
- Identify what it is you wish to find out (define proper research questions).
- Keep questions clear and simple.
- Make questions relevant to respondents.
- Use filter questions when necessary.
- Avoid questions that are likely to confuse respondents such as those that use double negatives, use culturally specific terms, or pose more than one question in the form of a single question.
- Imagine how you would feel responding to questions.
- Get feedback, especially from people who resemble those in your sample (a small-scale pilot study), before you send the survey to your respondents.
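If your draft questions live in a spreadsheet or script, you can even automate a first, very rough pass over some of these checks. The sketch below is a heuristic illustration only; the patterns it looks for (double negatives, two-topic wording, leading openers) are simplified assumptions drawn from the guidelines above, and every flag still needs human judgment.

```python
import re

# A rough, heuristic reviewer for some wording problems discussed above.
# It will miss issues and raise false positives; it only flags questions
# for a human to look at again. The patterns below are illustrative only.

LEADING_OPENERS = ("don't you think", "do you agree that", "wouldn't you say")

def review_question(question: str) -> list[str]:
    q = question.lower().strip()
    flags = []
    negations = re.findall(r"\b(?:not|no|never)\b", q)
    if len(negations) >= 2:
        flags.append("possible double negative")
    elif negations:
        flags.append("negatively worded question")
    if " and " in q:
        flags.append("check for a double-barrelled (two-topic) question")
    if q.startswith(LEADING_OPENERS):
        flags.append("possible leading question")
    return flags

print(review_question("Are you satisfied with the hardware and software provided for your work?"))
print(review_question("Did you not drink during your first semester of college?"))
print(review_question("Don't you think that most employees would rather have a four day work week?"))
```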
Response Options
In the case of structured (closed-ended) questions you should carefully consider the response options. Response options are the answers that you provide to the people taking your survey, and they are usually captured using one of the following response formats:
- Dichotomous response, where respondents are asked to select one of two possible choices, such as true/false, yes/no, or agree/disagree. An example of such a question is: Do you think that the death penalty is justified under some circumstances? (circle one): yes / no.
- Nominal response, where respondents are presented with more than two unordered options, such as: What is your industry of employment?— manufacturing / consumer services / retail / education / healthcare / tourism & hospitality / other
- Ordinal response, where respondents have more than two ordered options, such as: What is your highest level of education?—high school / college degree / graduate studies.
- Interval-level response, where respondents are presented with a 5-point or 7-point interval scale, such as: To what extent do you agree with the following statement [insert statement]: strongly disagree / disagree / neither agree nor disagree / agree / strongly agree.
- Continuous response, where respondents enter a continuous (ratio-scaled) value with a meaningful zero point, such as their age, weight, or year of birth (How old are you? What is your weight in kilograms? In which year were you born?). These responses generally tend to be of the fill-in-the-blanks type.
- Rank ordering the response options, such as "Rank the five elements that influenced your choice of the university program from the most to the least influential; assign number 1 to the most influential element and number 5 to the least influential element; properly assign numbers 2-4 to the rest of the elements."
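If you are prototyping a survey programmatically, the response formats above can be represented with a simple data structure. The sketch below is a minimal, hypothetical illustration; the class and field names are not drawn from any particular survey tool.

```python
from dataclasses import dataclass, field

# A minimal, hypothetical sketch of representing closed-ended questions and
# their response formats; the class and field names are illustrative only.

@dataclass
class Question:
    text: str
    response_format: str          # e.g., "dichotomous", "nominal", "ordinal", "interval"
    options: list[str] = field(default_factory=list)
    allow_multiple: bool = False  # instruct respondents how many options to choose

education = Question(
    text="What is your highest level of education?",
    response_format="ordinal",
    options=["high school", "college degree", "graduate studies"],
)

agreement = Question(
    text="Online learning is more difficult than traditional in-class learning.",
    response_format="interval",
    options=["strongly disagree", "disagree", "neither agree nor disagree",
             "agree", "strongly agree"],
)

print(education)
print(agreement)
```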
Generally, respondents will be asked to choose a single (or best) response to each question you pose, though certainly it makes sense in some cases to instruct respondents to choose multiple response options. One caution to keep in mind when accepting multiple responses to a single question, however, is that doing so may add complexity when it comes to analysing your survey results. Nevertheless, for each closed-ended question, clearly instruct the respondents on the number of response options they are required to choose.
Guidelines for Designing Response Options
Here are a few guidelines worth following when designing the response options.
Ensure that your response options are mutually exclusive. In other words, there should be no overlapping categories in the response options. In the example about the frequency of alcohol consumption, if we ask “On average, how many times per week did you consume alcoholic beverages during your first semester of college?”, we may then provide the following response options: a) less than one time per week, b) 1-2, c) 3-4, d) 5-6, e) 7+. Do you notice that there are no overlapping categories in the response options for these questions?
Create response options that are exhaustive. In other words, every possible response should be covered in the set of response options. In the example shown in the paragraph above, we have covered all possibilities: those who drank, say, an average of once per month can choose the first response option (“less than one time per week”) while those who drank multiple times a day each day of the week can choose the last response option (“7+”). All the possibilities in between these two extremes are covered by the middle three response options. When you are unsure about capturing all response options, you can add the option “other” to the list to enable the respondents to specify their response in their own words. This is particularly useful in the case of nominal (unordered) response options.
If there is a reason to believe that not all respondents would be able to select from the given response options, then add the “not able to answer” or “not applicable to me” response to the response options list, in order to ensure the validity of collected data.
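For numeric response options, the "mutually exclusive and exhaustive" checks can even be automated. The sketch below is a minimal illustration using the drinks-per-week example above; it assumes the options are integer counts with an open-ended top category.

```python
import math

# A minimal sketch that checks numeric response options are mutually exclusive
# and exhaustive. It assumes integer counts (drinks per week) and uses
# math.inf to represent the open-ended "7+" category.

bins = [(0, 0), (1, 2), (3, 4), (5, 6), (7, math.inf)]  # (low, high) per option

def check_bins(bins):
    issues = []
    for (low, high), (next_low, _) in zip(bins, bins[1:]):
        if next_low <= high:
            issues.append(f"options {(low, high)} and the next option overlap")
        elif next_low > high + 1:
            issues.append(f"gap between {high} and {next_low}: not exhaustive")
    if bins[0][0] != 0:
        issues.append("values below the first option are not covered")
    if bins[-1][1] != math.inf:
        issues.append("no open-ended top category: very high values are not covered")
    return issues or ["options are mutually exclusive and exhaustive"]

print(check_bins(bins))
```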
Use a question matrix when answer categories are identical. Using a matrix is a nice way of streamlining response options. A matrix is a question type that lists a set of questions for which the answer categories are all the same (Table 7.4.2). If you have a set of questions for which the response options are all the same, create a matrix rather than posing each question and its response options individually. Not only will this save you some space in your survey but it will also help respondents progress through your survey more easily and quickly.
Table 7.4.2 Sample Question Matrix
Online learning… | Strongly disagree | Disagree | Neither disagree nor agree | Agree | Strongly agree |
---|---|---|---|---|---|
…is more difficult than traditional in-class learning. | 1 | 2 | 3 | 4 | 5 |
…is only appropriate for part-time employed students. | 1 | 2 | 3 | 4 | 5 |
…will become the predominant mode of learning in the near future. | 1 | 2 | 3 | 4 | 5 |
Ensure that response options are aligned with the question wording and vice versa. For example, if you have a yes/no question type, then you should only provide the yes and no response options (e.g., “Do you support the Prime Minister in his endeavours to foster respectful communication?”: yes/no). If you want to know about the frequency of an event (let’s say the frequency of alcohol consumption during college), then a yes/no question (e.g., “Did you consume alcohol during college?”) would be inappropriate. It would also be inappropriate to have a yes/no question type and ordered response options (e.g., every day of the week, once per week, etc.). This is where attention to detail and proofreading skills come into play.
Questionnaire Design
In addition to constructing quality questions and posing clear response options, you’ll also need to think about how to present your written questions and response options to survey respondents. Designing questionnaires takes some thought, and in this section, we’ll discuss what you should think about as you prepare to present your well-constructed survey questions on a questionnaire.
In general, questions should flow logically from one to the next. To achieve the best response rates, questions should flow from the least sensitive to the most sensitive, from the factual and behavioural to the attitudinal, and from the more general to the more specific.
One of the first things to do once you’ve come up with a good set of survey questions is to group those questions thematically. In our example of the transition to college, perhaps we’d have a few questions asking about study habits, others focused on friendships, and still others on exercise and eating habits. Those may be the themes around which we organize our questions. Or perhaps it would make more sense to present any questions we had about precollege life and habits and then present a series of questions about life after beginning college. The point here is to be deliberate about how you present your questions to respondents.
Once you have grouped similar questions together, you’ll need to think about the order in which to present those question groups. Most survey researchers agree that it is best to begin a survey with questions that will make respondents want to continue (Babbie, 2010; Dillman, 2000; Neuman, 2003). In other words, don’t bore respondents, but don’t scare them away either.
There’s some disagreement over where on a survey to place demographic questions such as those about a person’s age, gender, and race. On the one hand, placing them at the beginning of the questionnaire may lead respondents to mistake the purpose of the survey. On the other hand, if your survey deals with some very sensitive or difficult topic, such as workplace racism or sexual harassment, you don’t want to scare respondents away or shock them by beginning with your most intrusive questions. Generally, it's advisable to put demographic questions at the end of the questionnaire, unless they are needed at the beginning, for instance when particular demographic questions serve as filter questions.
In truth, the order in which you present questions on a survey is best determined by the unique characteristics of your research: only you, the researcher, in consultation with colleagues, can determine how best to order your questions. To do so, think about the unique characteristics of your topic, your questions, and most importantly, your sample. Keeping in mind the characteristics and needs of the potential respondents should help guide you as you determine the most appropriate order in which to present your questions.
Also consider the time it will take respondents to complete your questionnaire. Surveys vary in length, from just a page or two to a dozen or more pages, which means they also vary in the time it takes to complete them. How long to make your survey depends on several factors. First, what is it that you wish to know? Wanting to understand how grades vary by gender and year in school certainly requires fewer questions than wanting to know how people’s experiences in college are shaped by demographic characteristics, college attended, housing situation, family background, college major, friendship networks, and extracurricular activities. Even if your research question requires a good number of survey questions, do your best to keep the questionnaire as brief as possible. Any hint that you’ve thrown in unnecessary questions will turn off respondents and may make them not want to complete your survey.
Second, and perhaps more important, consider the amount of time respondents will likely be willing to spend completing your questionnaire. If you are studying college students, your survey would be taking up valuable study time, so they won’t want to spend more than a few minutes on it. The time that survey researchers ask respondents to spend on questionnaires varies greatly. Some advise that surveys should take no longer than about 15 minutes to complete (cited in Babbie, 2010); others suggest that up to 20 minutes is acceptable (Hopper, 2010). This applies to self-completion surveys; surveys administered in person, face-to-face, by an interviewer asking the questions may take up to an hour or more. As with question order, there is no clear-cut rule for how long a survey should take to complete. Consider the unique characteristics of your study and your sample in order to determine how long to make your questionnaire.
A good way to estimate the time it will take respondents to complete your questionnaire is through pretesting (piloting). Pretesting allows you to get feedback on your questionnaire so you can improve it before you actually administer it. Pretesting is usually done on a small number of people (e.g., 5 to 10) who resemble your intended sample and to whom you have easy access. By pretesting your questionnaire, you can find out how understandable your questions are, get feedback on question wording and order, find out whether any of your questions are unclear, overreach, or offend, and learn whether there are places where you should have included filter questions, to name just a few of the benefits of pretesting. You can also time pre-testers as they take your survey. Ask them to complete the survey as though they were actually members of your sample. This will give you a good idea about what sort of time estimate to provide respondents when it is administered, and whether you are able to add additional questions/items or need to cut a few questions/items.
Perhaps this goes without saying, but your questionnaire should also be attractive. A messy presentation style can confuse respondents or, at the very least, annoy them. Use survey design tools like Google Forms or Microsoft Forms to create a consistent design and to automate the collection and visualization of the data collected.
In sum, here are some general rules regarding question sequencing and questionnaire design:
- Brainstorming and consulting your research to date are two important early steps to take when preparing to write effective survey questions.
- Start with easy non-threatening questions that can be easily recalled.
- Never start with an open-ended question.
- If following an historical sequence of events, follow a chronological order from earliest to latest.
- Ask about one topic at a time (group the questions meaningfully around common topics). When switching topics, use a transition, such as “The next section examines your opinions about …”.
- Make sure that your survey questions will be relevant to all respondents and use filter or contingency questions as needed, such as: “If you answered “yes” to question 5, please proceed to Section 2. If you answered “no” go to Section 3.” Automated surveys will guide respondents to the next question based on the filters provided.
- Always pretest your questionnaire before administering it to respondents in a field setting. Such pretesting may uncover ambiguity, lack of clarity, or biases in question wording, which should be eliminated before administering the survey to the intended sample.
Qualitative Interviews
Qualitative interviews are sometimes called intensive or in-depth interviews. These interviews are semi-structured; the researcher has a particular topic about which he or she would like to hear from the respondent, but questions are open-ended and may not be asked in exactly the same way or in exactly the same order to each respondent. In in-depth interviews, the primary aim is to hear from respondents about what they think is important about the topic at hand and to hear it in their own words. In this section, we’ll take a look at how to conduct interviews that are specifically qualitative in nature and consider some of the strengths and weaknesses of this method.
Qualitative interviews might feel more like a conversation than an interview to respondents, but the researcher is in fact usually guiding the conversation with the goal of gathering information from a respondent. A key difference between qualitative and quantitative interviewing is that qualitative interviews contain open-ended questions without response options. Open-ended questions are more demanding of participants than closed-ended questions, for they require participants to come up with their own words, phrases, or sentences to respond. In qualitative interviews you should avoid asking closed-ended questions, such as yes/no questions or questions that can be answered with short replies (e.g., in a couple of words or phrases).
In a qualitative interview, the researcher usually develops a guide in advance that he or she then refers to during the interview (or memorizes in advance of the interview). An interview guide is a list of topics or questions that the interviewer hopes to cover during the course of an interview. It is called a guide because any good interview is a conversation, and one topic can lead to another that is not on your list. Interview guides should outline issues that a researcher feels are likely to be important, but because participants are asked to provide answers in their own words, and to raise points that they believe are important, each interview is likely to flow a little differently. While the opening question in an in-depth interview may be the same across all interviews, from that point on what the participant says will shape how the interview proceeds. Here, excellent listening skills, knowledge of your goals, and an intuitive approach work together to obtain the information you need from the interviewee. It takes a skilled interviewer to be able to ask questions; listen to respondents; and pick up on cues about when to follow up, when to move on, and when to simply let the participant speak without guidance or interruption.
As with questionnaire surveys, the interview guide should be derived from the research questions of the study. The specific format of an interview guide (a list of topics or a detailed list of general and additional interview questions) may depend on your style, experience, and comfort level as an interviewer or with your topic. If you are experienced in qualitative interviewing and highly familiar with the research topic, it might be enough to prepare the interview guide as a list of potential topics and subtopics to be covered during the interview. But if you are less experienced in qualitative interviewing and/or less familiar with the research topic, it is much better to prepare a detailed set of interview questions.
Begin constructing your interview guide by brainstorming all the topics and questions that come to mind when you think about your research question(s). Once you’ve got a pretty good list, you can pare it down by cutting questions and topics that seem redundant and group like questions and topics together. If you haven’t done so yet, you may also want to come up with question and topic headings for your grouped categories. You should also consult the scholarly literature to find out what kinds of questions other interviewers have asked in studies of similar topics.
Also, when preparing the interview guide, it is a good idea to define and separate the general questions from a few additional probing questions for each general question. You may use these probing questions if required, such as when a respondent has difficulty answering a given general question, needs additional guidance, or when you want to dig deeper into the topic. Often, interviewers are left with a few questions that were not asked, which is why prioritizing the questions from most important to least important is good practice.
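One way to keep general questions, probes, and priorities organized is to store the interview guide as a simple structured list. The sketch below is a hypothetical illustration only; the topics, questions, and probes are placeholders.

```python
# A minimal, hypothetical sketch of an interview guide stored as a structured
# list: each topic has one general question and optional probes, ordered from
# most to least important. All content here is placeholder text.

interview_guide = [
    {
        "topic": "Transition to college",
        "general": "Tell me about your first few weeks at college.",
        "probes": [
            "Could you tell me a little more about that?",
            "How did that compare with what you expected?",
        ],
    },
    {
        "topic": "Study habits",
        "general": "How do you usually prepare for exams?",
        "probes": ["What helps most when a course feels difficult?"],
    },
]

for item in interview_guide:
    print(f"{item['topic']}: {item['general']} ({len(item['probes'])} probes)")
```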
As with quantitative survey research, it is best not to place very sensitive or potentially controversial questions at the very beginning of your qualitative interview guide. You need to give participants the opportunity to become comfortable with the interview process and with you. Finally, get some feedback on your interview guide. Ask your colleagues for some guidance and suggestions once you’ve come up with what you think is a pretty strong guide. Chances are they’ll catch a few things you hadn’t noticed.
In terms of the questions you include on your interview guide, there are a few guidelines worth noting:
Avoid questions that can be answered with a simple yes or no, or if you do choose to include such questions, be sure to include follow-up questions. Remember, one of the benefits of qualitative interviews is that you can ask participants for more information—be sure to do so. While it is a good idea to ask follow-up questions, try to avoid asking “why” as your follow-up question, as this particular question can come off as confrontational, even if that is not how you intend it. Often people won’t know how to respond to “why,” perhaps because they don’t even know why themselves. Instead of “why,” say something like, “Could you tell me a little more about that?” This allows participants to explain themselves further without feeling that they’re being doubted or questioned in a hostile way.
Avoid phrasing your questions in a leading way. For example, rather than asking, “Don’t you think that most employees would rather have a four day work week?” you could ask, “What comes to mind for you when you hear that a four day work week is being considered at our company?”
Collecting and Storing Interview Information
Even after the interview guide is constructed, the interviewer is not yet ready to begin conducting interviews. The researcher next has to decide how to collect and maintain the information. It is probably most common for qualitative interviewers to make audio recordings of the interviews they conduct. Interviews, especially those conducted on meeting platforms like Zoom, can also now be captured using AI-assisted tools that provide a video, a transcription, and a summary of the interview. For in-person interviews, recordings can be captured by applications designed for this purpose, such as Otter.ai.
Recording interviews allows the researcher to focus on her or his interaction with the interview participant rather than being distracted by trying to take notes. Of course, not all participants will feel comfortable being recorded and sometimes even the interviewer may feel that the subject is so sensitive that recording would be inappropriate. If this is the case, it is up to the researcher to balance excellent note-taking with exceptional question asking and even better listening. Whether you will be recording your interviews or not (and especially if not), practicing the interview in advance is crucial. Ideally, you’ll find a friend or two willing to participate in a couple of trial runs with you. Even better, you’ll find a friend or two who are similar in at least some ways to your sample. They can give you the best feedback on your questions and your interviewing performance.
In large community projects, such as real estate developments, rezoning, and urban renewal, as well as in more local or internal innovation or change, it is always a good idea to involve stakeholders in conversations and consultations. How you design the consultation process will affect its success, so be deliberate in your planning. Your stakeholders' input can be among the most valuable assets available to a researcher and can guide decisions that move projects toward the benefit of all affected. Using stakeholder mapping, ethical consultation practices, and effective question design skills will facilitate the process.
Additional Resources
For information on developing stakeholder consultation with Indigenous communities, read Part 1: Stakeholder Consultation (Clennan, 2007). For a recent and local example of a stakeholder engagement plan, see the University of Victoria’s Campus Greenway Engagement Plan (University of Victoria Campus Planning and Sustainability, n.d.). A significant step in this plan, a Design Charrette, was implemented in the fall of 2018; the results of that engagement activity are presented in a Summary Report (.pdf) (University of Victoria Campus Planning and Sustainability, 2018), and the original plan was revised and augmented based on stakeholder feedback.
The segment on asking survey and interview questions was partially adapted from Marko Divjak's (2007-2025) Google Classroom unit, Asking Questions in Surveys and Interviews, available on OER Commons.
His work consisted of remixing and adapting the following two open educational resources, both licensed under a CC BY-NC-SA license:
Bhattacherjee, A. (2012). Social Science Research: Principles, Methods, and Practices. University of South Florida: Scholar Commons (chapter 9). Retrieved from: https://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1002&context=oa_textbooks
Blackstone, A. (2014). Principles of sociological inquiry – Qualitative and quantitative methods (chapters 8 and 9). Retrieved from: http://www.saylor.org/site/textbooks/Principles%20of%20Sociological%20Inquiry.pdf
References
Association of Project Management. (2020). What is stakeholder engagement [Video]. YouTube. https://www.youtube.com/watch?v=ZzqvF9uJ1hA&t=1s
Babbie, E. (2010). The practice of social research (12th ed.). Belmont, CA: Wadsworth.
Bhattacherjee, A. (2012). Social Science Research: Principles, Methods, and Practices. University of South Florida: Scholar Commons. Retrieved from: https://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1002&context=oa_textbooks
Clennan, R. (2007, April 24). Part 1: Stakeholder consultation [PDF]. International Finance Corporation. https://documentcloud.adobe.com/link/track?uri=urn:aaid:scds:US:1e1092b2-09d5-47c1-b6bd-be2a96f46081
Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York, NY: Wiley.
Driscoll, C. & Starik, M. (2004). The primordial stakeholder: Advancing the conceptual consideration of stakeholder status for the natural environment. Journal of Business Ethics, 49(1), pp. 55-73. https://doi.org/10.1023/B:BUSI.0000013852.62017.0e
Hagan, M. (2017, August 28). Stakeholder mapping of traffic ticket system. Open Law Lab. Available: http://www.openlawlab.com/2017/08/28/stakeholder-mapping-the-traffic-ticket-system/ . CC-BY-NC-SA 4.0.
Hopper, J. (2010). How long should a survey be? Retrieved from http://www.verstaresearch.com/blog/how-long-should-a-survey-be
Last, S. (2019). Technical writing essentials. https://pressbooks.bccampus.ca/technicalwriting/
Microsoft. (2025, January 27). Drafting survey questions prompt assistance. Copilot.
Mungikar, S. (2018). Identifying stakeholders [Video]. YouTube. https://www.youtube.com/watch?v=8uZiGB8DeJg
Neuman, W. L. (2003). Social research methods: Qualitative and quantitative approaches (5th ed.). Boston, MA: Pearson.
Nikki. (2004). Wicked. Urban Dictionary.
OHCHR. (2013). Free, prior and informed consent of indigenous peoples. Office of the High Commissioner for Human Rights.
Purdue Online Writing Lab (OWL). (n.d.). Creating good interview and survey questions. Purdue University. https://owl.purdue.edu/owl/research_and_citation/conducting_research/conducting_primary_research/interview_and_survey_questions.html
QuestionPro. (n.d.). 10 steps to a good survey design [Infographic]. Survey design. https://www.questionpro.com/features/survey-design/
United Nations. (2015). The 17 goals. United Nations Sustainable Development.
University of Victoria Campus Planning and Sustainability. (n.d.). Engagement plan for: The University of Victoria Grand Promenade landscape plan and design guidelines. Campus Greenway. University of Victoria. https://www.uvic.ca/campusplanning/current-projects/campusgreenway/index.php
University of Victoria Campus Planning and Sustainability. (2018). The Grand Promenade Design Charrette: Summary Report 11.2018," Campus Greenway. University of Victoria. https://www.uvic.ca/campusplanning/current-projects/campusgreenway/index.php
Primary research undertaken when embarking on any large-scale project will most likely include “public engagement,” or stakeholder consultation. Public engagement is the broadest term used to describe the increasingly necessary process that companies, organizations, and governments must undertake to achieve a “social license to operate.” Stakeholder engagement brings humans into the decision-making process and can range from simply informing the public about plans for a project, to engaging them in more consultative practices like gathering input and feedback from various groups, and even to empowering key community stakeholders in the final decisions.
For projects that have social, economic, and environmental impacts, and especially for those that foster an Indigenous worldview, respect for the Sustainable Development Goals, respect for the rights of Indigenous peoples, and a commitment to social change, stakeholder consultation is an increasingly critical part of the planning stage. Creating an understanding of how projects will affect a wide variety of stakeholders is beneficial both for the company initiating the project and for the people who will be affected by it. Listening to stakeholder feedback and concerns can help identify and mitigate risks that could otherwise slow down or even derail a project. It is also an opportunity to build into the project values and actions that work towards improving environmental conditions and uplifting communities and the individuals who belong to them. For stakeholders, the consultation process creates an opportunity to be informed, to inform the company about local contexts that may not be obvious, to raise issues and concerns, and to help shape the objectives and outcomes of the project.
What is a Stakeholder?
Stakeholders include any individual or group who may have a direct or indirect “stake” in the project – anyone who can be affected by it, or who can have an effect on the actions or decisions of the company, organization, or government. They can also be people who are simply interested in the matter, but more often they are potential beneficiaries or risk-bearers. They can be internal – people from within the company or organization (owners, managers, employees, shareholders, volunteers, interns, students, etc.) – or external, such as community members or groups, investors, suppliers, consumers, policy makers, etc. Increasingly, arguments are being made for considering non-human stakeholders such as the natural environment (Driscoll & Starik, 2004). The following video, Identifying Stakeholders (Mungikar, 2018), further explains the process of identifying stakeholders.
Stakeholders can contribute significantly to the decision-making and problem-solving processes. People most directly affected by the problem and its impacts can help you
- understand the context, issues and potential impacts,
- determine your focus, scope, and objectives for solutions, and
- establish whether further research is needed into the problem.
People who are attempting to solve the problem can help you
- refine, refocus, prioritize solution ideas,
- define necessary steps to achieving them, and
- implement solutions, provide key data, resources, etc.
There are also people who could help solve the problem but lack awareness of it or of their potential role. Consultation processes help create awareness of the project and can draw these people in during its early stages.
Stakeholder Mapping
The more a stakeholder group will be materially affected by the proposed project, the more important it is for them to be identified, properly informed, and encouraged to participate in the consultation process. Determining who the various stakeholders are, as well as their level of interest in the project, the potential impact it will have on them, and the power they have to shape the process and outcome is critical. You might start by brainstorming or mind-mapping all the stakeholders you can think of. See Figure 7.4.1 as an example.
[Figure 7.4.1: Example of a stakeholder brainstorm/mind map]
LLMs can be used for the mapping process and can reveal stakeholders that you might not think of on your own. To enlist an LLM for assistance, follow this process:
Asking the LLM to Generate a Differentiated List of Stakeholders
You can adapt the following example to create a prompt that will assist in identifying stakeholders using an LLM. For the process to be effective, you will need to include specific information about the proposed project or issue, while being mindful of your company's confidentiality policy.
The City of Toronto is thinking of implementing an automated traffic citation system. Before implementation, the City will be consulting with various community stakeholders, including small and large businesses, residents, local community groups, health care and social work agencies, and Indigenous groups. The City will also include in the consultation process law enforcement and potential vendors. Please help me to develop a categorized list of potential stakeholders for this project. I want to classify those who will offer strong support, who would be neutral, and those in opposition.
Once stakeholders who may be impacted have been identified, they can be organized into categories or a matrix. One standard method of organizing stakeholders is to determine which are likely to be in support of the project and which are likely to oppose it, and then determine how much power or influence each of those groups has (see Figure 7.4.2 for a visualization of such a matrix).
[Figure 7.4.2: Stakeholder matrix plotting level of support against level of influence]
For example, a mayor of a community has a strong level of influence. If the mayor is in full support of the project, this stakeholder would go in the top right corner of the matrix. Someone who is deeply opposed to the project, but has little influence or power, would go at the bottom left corner.
A matrix like this can help you determine what level of engagement is warranted: where efforts to “consult and involve” might be most needed and most effective, or where more efforts to simply “inform” might be most useful. You might also consider the stakeholders’ level of knowledge on the issue, level of commitment (whether in support or opposed), and resources available.
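For readers who track stakeholder lists in a spreadsheet or script, the quadrant logic of Figure 7.4.2 can be expressed in a few lines of code. The following Python sketch is illustrative only: the scoring scale, the influence threshold, and the suggested engagement labels are assumptions for this example, not part of any standard stakeholder-analysis tool.

```python
# A minimal sketch of the support/influence matrix in Figure 7.4.2.
# Scores, the cutoff, and the suggested labels are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Stakeholder:
    name: str
    support: float    # -1.0 (strongly opposed) to +1.0 (strongly supportive)
    influence: float  # 0.0 (little influence) to 1.0 (high influence)

def suggested_engagement(s: Stakeholder, influence_cutoff: float = 0.5) -> str:
    """Suggest an engagement emphasis based on the stakeholder's quadrant."""
    if s.influence >= influence_cutoff:
        # High-influence stakeholders warrant the most engagement effort.
        return "consult and involve" if s.support >= 0 else "consult, involve, and address concerns"
    # Lower-influence stakeholders may mainly need to be kept informed.
    return "inform" if s.support >= 0 else "inform and monitor concerns"

mayor = Stakeholder("Mayor (supportive)", support=0.9, influence=0.9)
resident = Stakeholder("Opposed resident", support=-0.8, influence=0.2)
for person in (mayor, resident):
    print(f"{person.name}: {suggested_engagement(person)}")
```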
As you proceed through the stakeholder mapping, and especially when using an LLM to assist, be alert to the potential for bias and exclusion. The process is intended to draw as many relevant stakeholders as possible into the consultation. In addition, there may be other factors that should be included in the matrix, so think beyond the boundaries of this example; the factors will be determined by the project and the populations involved. Being aware of personal, group, and institutional bias, along with contextual variables, will help you keep a more open mindset and account for the range of insights needed to proceed thoroughly.
Levels of Stakeholder Engagement
Levels of engagement can range from simply informing people about what you plan to do to actively seeking consent and placing the final decision in their hands. This range, shown in Figure 7.4.3, is typically presented as a “spectrum” or continuum of engagement, from the least to the most engagement with stakeholders.
[Figure 7.4.3: Spectrum of stakeholder engagement, from inform to empower]
Review the following video, What is Stakeholder Engagement? (Association of Project Management, 2020), for an additional overview of this process.
Depending on the type of project, the potential impacts, and the types and needs of stakeholders, you may engage in a number of levels and strategies of engagement across this spectrum using a variety of different tools (see Table 7.4.1). Your approach may focus on one or several of these:
- Inform: Provide stakeholders with balanced and objective information to help them understand the project, the problem, and the solution alternatives. (There is no opportunity for stakeholder input or decision-making.)
- Consult: Gather feedback on the information given. The level of input can range from minimal interaction (online surveys, etc.) to extensive. It can be a one-time contribution or ongoing/iterative opportunities to give feedback to be considered in the decision-making process.
- Involve: Work directly with stakeholders during the process to ensure that their concerns and desired outcomes are fully understood and taken into account. Final decisions are still made by the consulting organization, but with well-considered input from stakeholders.
- Collaborate: Partner with stakeholders at each stage of decision-making, including developing alternative solution ideas and choosing the preferred solution together. The goal is to achieve consensus regarding decisions.
- Empower: Place final decision-making power in the hands of stakeholders. Voting ballots and referenda are common examples. This level of stakeholder engagement is rare and usually includes a small number of people who represent important stakeholder groups.
Table 7.4.1 Typical tools for public engagement
Inform | Consult | Involve / Collaborate / Empower |
---|---|---|
The Consultation Process: Basic Steps
There is no single “right” way of consulting with stakeholders. Each situation will be different, so each consultation process will be context-specific and will require a detailed plan. A poorly planned consultation process can lead to a lack of trust between stakeholders and the company. Therefore, it is critical that the process be carefully mapped out in advance, and that preliminary work is done to determine the needs and goals of the process and the stakeholders involved. In particular, ensure that whatever tools you use are fully accessible to all the stakeholders you plan to consult. For example, an online survey is of little use to a community that lacks robust Wi-Fi infrastructure. Consider the following steps to structure your consultation process:
- Situation Assessment: Who needs to be consulted, about what, and why? Define internal and external stakeholders; determine their level of involvement, interest, and potential impact, as well as their needs and conditions for effective engagement.
- Goal-setting: What is your strategic purpose for consulting with stakeholders at this phase of the project? Define clear understandable goals and objectives for the role of stakeholders in the decision-making process. Determine what questions, concerns, and goals the stakeholders will have and how these can be integrated into the process. Integrate applicable Sustainable Development Goals from the outset so they are not treated as an afterthought.
- Planning/Requirements: Based on the situation assessment and goals, determine what engagement strategies to use and how to implement them to best achieve these goals. Ensure that strategies address accessibility and inclusivity and take vulnerable populations into account. Consider legal or regulatory requirements, policies, or conditions that need to be met. For example, because the research will involve humans, the project should be approved by a Research Ethics Board prior to engagement with the public. (For more information on research involving human subjects, go to Chapter 7.5.) During this phase, you should also determine how you will collect, record, track, analyze, and disseminate the data.
- Process and Event Management: In this phase, you will devise strategies to keep the planned activities moving forward and on-track, and adjust strategies as needed. Be sure to keep track of all documentation.
- Evaluation: Design an evaluation metric to gauge the success of the engagement strategies; collect, analyze, and act on the data collected throughout the process. Determine how you will report the results of the engagement process back to the stakeholders.
In situations where projects will affect Indigenous communities, land, and resources, you are obligated by the United Nations Office of the High Commissioner for Human Rights to apply the practice of "free, prior, and informed consent of indigenous peoples" (OHCHR, 2013).
Communicating Effectively in Stakeholder Engagement
Effective communication is the foundation of stakeholder consultation. The ability to create and distribute effective information, develop meaningful relationships, build trust, and listen to public input is essential.
The basic communication skills required for any successful stakeholder engagement project include:
- Effective Writing: the ability to create clear and concise written messages using plain language and structural conventions.
- Visual Rhetoric: the ability to combine words, images, and graphics to support, clarify, and illustrate ideas and make complex issues understandable to a general audience.
- Public Speaking/Presenting: the ability to present information to large audiences in a comfortable and understandable way. The ability to create effective visual information that increases the audience’s understanding.
- Interpersonal and Intercultural Skills: the ability to relate to people in face-to-face situations, to make them feel comfortable and secure, and to be mindful of cultural factors that may affect interest level, accessibility, impact, values, or opinions.
- Collaboration Skills: the ability to work effectively and with minimal friction with team members. Collaboration typically involves frequent and open communication, cooperation, goodwill, information sharing, problem-solving, and empathy.
- Active Listening: the ability to focus on the speaker and portray the behaviours that provide them with the time and safety needed to be heard and understood. The ability to report back accurately and fully what you have heard from participants.
Note that when communicating with Indigenous Elders and Chiefs, specific protocols should be followed to honour the traditions, culture, and history of Indigenous people.
Caring for Your Stakeholders: Introductory Principles
Most often, stakeholder engagement and consultation will involve human participants, since they are the ones who provide the input needed to move a project forward. Because some research conducted in the past inflicted harm on participants, strict guidelines have been developed for the ethical treatment of humans in research. When first embarking on a research project involving humans, you must first check in with your professor or workplace manager, who will determine whether an application for approval must be submitted to the Research Ethics Board (REB) at the institution or organization. This approval process helps to ensure that the research methods, as well as data collection, storage, and use, all align with ethical practices and laws.
The basic guidelines below are typically part of the process of ensuring that you are transparent and careful in how you approach your research and your human participants and subjects. Be reminded that when gathering, storing, and using data, especially data gathered from people, you must abide by Canadian federal and provincial privacy laws. Please refer to the Personal Information Protection and Electronic Documents Act (PIPEDA) and Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA) for more information. See more about protecting people's information in Chapter 7.5 Research Ethics.
Recruiting Participants
When recruiting potential participants, you must give them the following information before you begin:
- Researcher name(s): Inform them of your name and contact information
- Affiliation: Provide a) the name of your institution or organization, b) your department, and c) your instructor's or manager's name and contact information
- Project Description: Include a brief description of the project in lay language that can be understood by the participants and clearly identifies the problem being addressed; mention the names of key people on your project team
- Purpose: Describe the purpose of your research or consultation/engagement project (objectives), and the benefits you hope will come from this project (overall goal). Your activity should not involve any deception (e.g., claiming to be gathering one kind of information, such as "do you prefer blue or green widgets?", but actually gathering another kind, such as "what percentage of the population is blue/green colour blind?"). See Chapter 7.5 for more on the topic of ethics violations.
- Use of information: Inform participants of the way in which the results will be presented and/or disseminated.
Informed Consent
You must gain the informed consent of the people you will be surveying, interviewing, or observing. This can be done using a consent form they can sign in person, or through an “implied consent” statement such as on an electronic survey. The consent form should include all the information in the “recruiting” section above. In addition, you should
- include a full description of all data collection methods, procedures, and instruments, as well as expectations regarding the amount of time required for participation
- inform participants that their participation is voluntary and that they may withdraw at any time without consequence, even if they have not completed the survey or interview,
- disclose any and all risks or discomfort that participation in the study may involve, and how these risks or discomfort will be addressed,
- ensure that all participants are adults (19 years of age or older) and fully capable of giving consent; do not recruit from vulnerable or at-risk groups, and do not collect demographic data regarding age, gender, or any other information not relevant to the study (e.g., phone numbers, medications they are taking, whether they have a criminal record, etc.), and
- include a method by which the participants can signal their consent, or include an implied consent statement such as the following: "By completing and submitting this survey, you are indicating your consent for the collection and use of information provided."
Managing the Data
Participants should be told what will happen to the data you gather:
- Survey data are anonymous if you do not track who submitted responses. In anonymous surveys, let participants know that once they submit their survey, it cannot be retrieved and removed from the overall results.
- Inform participants of the means by which their anonymity will be protected and data will be kept confidential as well as how the raw data, including tapes, digital recordings, notes, and other types of data will be disposed of at the end of the project.
- Let survey participants know a) that your research results will be reported without their names and identifiers, b) where the data will be stored, c) how it will be "published", and d) what will happen to the raw data once your project is complete.
- Let interview participants know how their information will be used and if their names will be included or cited.
There may be additional issues that must be addressed, such as accessibility and cultural considerations, but those listed above are typically included. If you are unsure whether a particular line of inquiry or method of data collection requires ethics approval, ask your instructor or manager. Most importantly, always be completely transparent and honest about what and how you are researching.
It may seem like “a lot of fuss” to go through simply to ask people whether they prefer blue widgets or green widgets, but there are important reasons for these guidelines. People participating in your research need to be reassured that you are doing this for a legitimate reason and that the information you are gathering will be treated with respect and care.
For more information about research ethics, see Chapter 7.5.
Asking Questions in Surveys and Interviews (adapted from Divjak, 2007-2025)
Questionnaire Surveys
Invented by Sir Francis Galton, a questionnaire is a research instrument consisting of a set of questions (items) intended to capture responses from respondents in a standardized manner. Questions should be designed so that respondents are able to read, understand, and respond to them in a meaningful way. Surveys are commonly used to gather information from stakeholders because they are efficient at gathering large amounts of data, are low in cost, and, when designed well, can reveal valuable information.
Here are the steps involved in applying a survey in your study:
[Figure: Steps involved in applying a survey in your study]
Constructing a survey questionnaire is an art. Numerous decisions must be made about the content of questions, their wording, format, and sequencing, all of which can have important consequences for the survey responses. In the process, you must let the ethical principles discussed in Chapter 7.5 guide you. Above all, you want to avoid doing anything that would cause physical or emotional harm to your participants. For example, word sensitive or controversial questions carefully, and avoid introducing unintended bias or asking leading questions. You want to design questions that elicit meaningful and accurate responses rather than ambiguous information that is impossible to quantify or analyze. As a result, constructing a survey questionnaire is not a linear, straightforward process. Instead, it is an iterative process, in which you will most likely need to produce several versions of the questionnaire and revise them before arriving at a final version.
Where to start? To write effective survey questions, begin by identifying what you wish to know. In other words, refer to the research questions (general and specific) that have guided your research process to ensure that you collect all relevant data through the survey.
Let’s say you want to understand how students make a successful transition from high school to college. Perhaps you wish to identify which students are comparatively more or less successful in this transition and which factors contribute to students’ success or lack thereof. Let’s suppose you have set up the following general research question: Which factors contribute to students’ success or failure in the process of transition from high school to university? To understand which factors shape (un)successful students’ transitions to university, you’ll need to include questions in your survey about all the possible factors that could affect this transition. Consulting the literature on the topic will certainly help, but you should also take the time to do some brainstorming on your own and to talk with others about what they think may be important. Perhaps time or space limitations won’t allow you to include every single item you’ve come up with, so you’ll also need to rank your questions so that you can be sure to include those that are the most important.
LLMs can assist in drafting survey questions when they are provided with specific information about your project. Remember that LLM output is often generalized, so carefully revising the generated questions for your specific research goals is essential.
Asking the LLM to Draft Survey Questions (adapted from Microsoft Copilot, 2025)
Use the following key elements in your prompt when asking an LLM to draft survey questions:
- Purpose of the Survey: Clearly state the objective of the survey. This helps the LLM understand the context and generate relevant questions.
- Example: "Create survey questions that will help to assess the inclusivity of our workplace environment [specify the type of business or industry]."
- Target Audience: Specify who the survey is intended to address (respondents). This ensures the questions are appropriate for the respondents.
- Example: "The survey is for all employees across different departments and levels."
- Type of Questions: Indicate the types of questions you want (e.g., multiple choice, Likert scale, open-ended*).
- Example: "Include a mix of multiple choice, Likert scale, and open-ended questions." (*Question types are explained in the sections below.)
- Key Topics or Areas: List the main topics or areas you want to cover in the survey.
- Example: "Focus on experiences of inclusion, diversity, and equity in the workplace."
- Number of Questions: Specify the desired number of questions.
- Example: "Generate 12 questions."
- Tone and Language: Mention any specific tone or language preferences.
- Example: "Use a respectful and inclusive tone."
- Additional Instructions: Include any other specific instructions or constraints.
- Example: "Ensure questions are sensitive to diverse backgrounds and identities."
Here's a sample prompt incorporating these elements:
Prompt for LLM to Create Survey Questions
Help me to develop a survey to assess the inclusivity of our workplace environment. The survey is intended for all employees across different departments and levels. Include a mix of multiple choice, Likert scale, and open-ended questions. I want to focus on experiences of inclusion, diversity, and equity in the workplace. Guide me as I generate 12 questions using a respectful and inclusive tone. Ensure that I develop questions that are sensitive to diverse backgrounds and identities.
By including these elements, you can provide clear and comprehensive instructions to LLMs, ensuring the generated survey questions are relevant and effective in assessing the topic area you are investigating.
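If you find yourself drafting many such prompts, the elements above can also be assembled programmatically. The Python sketch below simply concatenates the elements into a prompt string; the function name and field labels are illustrative assumptions, and the resulting text would still be pasted into (or sent to) whatever LLM your organization permits.

```python
# A minimal sketch of assembling the prompt elements listed above into a single
# LLM prompt string. The field names and wording are illustrative assumptions;
# adapt them to your project and your organization's confidentiality policy.

def build_survey_prompt(purpose: str, audience: str, question_types: str,
                        topics: str, number: int, tone: str,
                        extra_instructions: str = "") -> str:
    parts = [
        f"Help me to develop a survey. Purpose: {purpose}.",
        f"Target audience: {audience}.",
        f"Question types: {question_types}.",
        f"Key topics: {topics}.",
        f"Generate {number} questions.",
        f"Tone and language: {tone}.",
    ]
    if extra_instructions:
        parts.append(f"Additional instructions: {extra_instructions}.")
    return "\n".join(parts)

prompt = build_survey_prompt(
    purpose="assess the inclusivity of our workplace environment",
    audience="all employees across different departments and levels",
    question_types="a mix of multiple choice, Likert scale, and open-ended questions",
    topics="experiences of inclusion, diversity, and equity in the workplace",
    number=12,
    tone="respectful and inclusive",
    extra_instructions="ensure questions are sensitive to diverse backgrounds and identities",
)
print(prompt)  # paste the assembled prompt into the LLM of your choice
```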
Response Formats
Questions may be unstructured (open-ended) or structured (closed-ended). Unstructured questions ask respondents to provide a response in their own words, while structured questions ask respondents to select an answer from a given set of choices (response options).
Generally, structured (closed-ended) questions predominate in surveys but sometimes survey researchers include open-ended questions as a way to gather additional details from respondents. An open-ended question does not include response options; instead, respondents are asked to reply to the question using their own words. These questions are generally used to find out more about a survey participant’s experiences or feelings about the topic. If, for example, a survey includes closed-ended questions asking respondents to report on their involvement in extracurricular activities during college, an open-ended question could ask respondents why they participated in those activities or what they gained from their participation. While responses to such questions may also be captured using a closed-ended format, allowing participants to share some of their responses in their own words can make the experience of completing the survey more satisfying to them and can reveal motivations or explanations that had not occurred to the researcher.
Don’t include too many unstructured (open-ended) questions in a survey, as respondents may be reluctant to write down detailed, elaborate, in-depth responses. There are other reasons to limit them as well, such as the complexity of analyzing open-ended responses, especially with large samples.
Designing Good Survey Questions
Responses obtained in survey research are very sensitive to the types of questions asked. Poorly framed or ambiguous questions will likely result in responses with very little value. Every question in a survey should be carefully scrutinized for the following issues:
Is the question clear and understandable?
Survey questions should be stated in clear and straightforward language, preferably in active voice, and without complicated words or jargon that may not be understood by a typical respondent. The only exception is if your survey is targeted at a specialized group of respondents, such as doctors, lawyers, or researchers, who use such jargon in their everyday environment. Make sure that the language you use is concise and precise, and avoid metaphorical expressions.
In the above example of the transition to college, the criterion of clarity means that respondents must understand what exactly you mean by “transition to college.” If you are going to use that phrase in your survey, you need to clearly define what it refers to.
Also, pay attention to words that may have different meanings in different parts of the country or among different segments of survey respondents. Take the word “wicked”: the term is traditionally associated with evil, but in contemporary subcultures it can mean “very,” “really,” or “extremely” (Nikki, 2004).
Is the question worded in a negative manner?
Negatively worded questions should be avoided, and in all cases, avoid double negatives. Questions such as "Should your local government not raise taxes?" and “Did you not drink during your first semester of college?” tend to confuse respondents and lead to inaccurate responses. Instead, you could ask: "What is your opinion on your local government’s intention to raise taxes?" and "Did you drink alcohol in your first semester of college?" These examples are obvious, but they illustrate the importance of careful question wording.
Is the question ambiguous?
Survey questions should not contain words or expressions that may be interpreted differently by different respondents (e.g., words like “any” or “just”). For instance, if you ask a respondent "What is your annual income?", it is unclear whether you are referring to salary/wages only or also to dividend, rental, and other income, and whether you mean personal income, family income (including a spouse’s wages), or personal and business income. Different interpretations by different respondents will lead to incomparable responses that cannot be interpreted correctly.
Does the question have biased or value-laden words?
Bias refers to any property of a question that encourages subjects to answer in a certain way. Kenneth Rasinski (1989, cited in Bhattacherjee, 2012) examined several studies on people’s attitudes toward government spending and observed that respondents tend to indicate stronger support for “assistance to the poor” and less for “welfare,” even though both terms have the same meaning. In the same work, more support was also observed for “halting rising crime rate” (and less for “law enforcement”), “solving problems of big cities” (and less for “assistance to big cities”), and “dealing with drug addiction” (and less for “drug rehabilitation”). Biased language or tone tends to skew responses. It is often difficult to anticipate biased wording in advance, but to the greatest extent possible, survey questions should be carefully scrutinized to avoid biased language.
Is the question on two topics?
Avoid creating questions that ask about two topics at once (sometimes called double-barrelled questions), because the respondent can only respond to one of the topics at a time. For example: "Are you satisfied with the hardware and software provided for your work?" How should a respondent answer if they are satisfied with the hardware but not with the software, or vice versa? It is always advisable to separate such questions into two: (1) "Are you satisfied with the hardware provided for your work?" and (2) "Are you satisfied with the software provided for your work?" Another example of a confusing question: "Does your family favour public television?" Some people in the family may favour public TV for themselves but favour certain cable TV programs, such as Sesame Street, for their children. Ensure that each of your questions clearly addresses one topic.
Is the question too general?
Sometimes, questions that are too general may not accurately capture respondents’ perceptions. If you ask someone how well they liked a certain book and provide a response scale ranging from “not at all” to “extremely well,” and that person selects “extremely well,” what does that mean? Instead, ask more specific behavioural questions, such as "Will you recommend this book to others?" or "Do you plan to read other books by the same author?" Likewise, instead of asking "How big is your firm?" (which may be interpreted differently by respondents), ask "How many people work for your firm?" and/or "What are the annual revenues of your firm?", which are both measures of firm size.
Is the question asking for too much detail?
Avoid asking for detail that serves no specific research purpose. For instance, do you need to have the age of each child in a household or is just the number of children in the household acceptable? If unsure, however, it is better to err on the side of details than generality.
Is the question leading or presumptuous?
If you ask, "What do you see as the benefits of a tax cut?", you are presuming that the respondent sees the tax cut as beneficial. But many people may not view tax cuts as beneficial, because tax cuts generally lead to less funding for public schools, larger class sizes, and fewer public services such as police, ambulance, and fire services. So avoid leading or presumptuous questions. Instead, first ask about the respondent's perception of the tax cut, and then direct the question about its benefits only to those respondents who perceive the tax cut as beneficial (see the section on filter questions below).
Is the question relevant only for particular segments of respondents?
If you decide to pose questions about matters with which only a portion of respondents will have had experience, introduce a filter question into your survey. A filter question identifies a subset of survey respondents who are asked additional questions that are not relevant to the entire sample. Perhaps in your survey on the transition to college you want to know whether substance use plays any role in students’ transitions. You may ask students how often they drank alcohol during their first semester of college, but this assumes that all students drank. Certainly, some may have abstained, and it wouldn’t make sense to ask the non-drinkers how often they drank. The filter would direct those who drank to the additional questions, while other respondents skip ahead to the next relevant question; a minimal sketch of this routing logic follows.
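The routing described above is essentially a branching rule. The following Python sketch illustrates the idea; the question identifiers and wording are hypothetical, and real survey platforms implement this kind of skip logic through their own configuration screens rather than code you write yourself.

```python
# A minimal sketch of filter (contingency) question routing: only respondents
# who report drinking are shown the follow-up frequency question.
# Question IDs and wording are illustrative assumptions, not from a real tool.

def next_question(current_id: str, answer: str) -> str:
    if current_id == "Q5_drank_alcohol":            # filter question (yes/no)
        return "Q6_drinking_frequency" if answer == "yes" else "Q7_study_habits"
    if current_id == "Q6_drinking_frequency":
        return "Q7_study_habits"
    return "END"

print(next_question("Q5_drank_alcohol", "no"))   # -> Q7_study_habits (skips Q6)
print(next_question("Q5_drank_alcohol", "yes"))  # -> Q6_drinking_frequency
```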
Is the question on a hypothetical situation?
A popular question in many television game shows is, “If you won a million dollars on this show, how would you spend it?” Most respondents have never been faced with such an amount of money and have never thought about it, so their answers tend to be quite random: take a tour around the world, buy a restaurant, spend it on education, save for retirement, help parents or children, or have a lavish wedding. Hypothetical questions produce imaginary answers, which cannot be used for making scientific inferences.
Do respondents have the information needed to correctly answer the question?
Often, we assume that subjects have the necessary information to answer a question when, in reality, they do not. Even if responses are obtained in such cases, they tend to be inaccurate, given the respondents’ lack of knowledge about the question being asked. For instance, we should not ask the CEO of a company about day-to-day operational details of which they may have no knowledge.
In the example of the transition to college, the respondents must have actually experienced the transition to college themselves in order for them to be able to answer the survey questions.
Does the question tend to elicit socially desirable answers?
In survey research, social desirability refers to the idea that respondents will try to answer questions in a way that will present them in a favourable light. Perhaps we decide that to understand the transition to college, we need to know whether respondents ever cheated on an exam in high school or college. We all know that cheating on exams is generally frowned upon. So, it may be difficult to get people to admit to such cheating on a survey. But if you can guarantee respondents’ confidentiality, or even better, their anonymity, chances are much better that they will be honest about having engaged in this socially undesirable behaviour. Another way to avoid problems of social desirability is to try to phrase difficult questions in the most benign way possible. Earl Babbie (2010) offers a useful suggestion for helping you do this—simply imagine how you would feel responding to your survey questions. If you would be uncomfortable, chances are others would as well.
In sum, in order to pose effective and proper survey questions, you should do the following:
- Identify what it is you wish to find out (define proper research questions).
- Keep questions clear and simple.
- Make questions relevant to respondents.
- Use filter questions when necessary.
- Avoid questions that are likely to confuse respondents such as those that use double negatives, use culturally specific terms, or pose more than one question in the form of a single question.
- Imagine how you would feel responding to questions.
- Get feedback, especially from people who resemble those in your sample (a small-scale pilot study), before you send the survey to your respondents.
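Some of the wording checks above can be partially automated as a rough first pass before human review and pilot testing. The heuristics and phrase lists in the following Python sketch are illustrative assumptions; such a script will miss many problems and flag some false positives, so treat its output as a prompt for reflection rather than a verdict.

```python
# A rough heuristic sketch for flagging some of the wording problems discussed
# above (double negatives, two topics joined by "and", leading openers).
# The phrase lists are illustrative assumptions; a human reviewer and a pilot
# test should always make the final call.

import re

LEADING_OPENERS = ("don't you think", "wouldn't you agree", "what are the benefits of")

def review_question(q: str) -> list[str]:
    issues = []
    text = q.lower()
    if re.search(r"\bnot\b.*\b(no|not|never)\b", text):
        issues.append("possible double negative")
    if re.search(r"\band\b", text) and text.endswith("?"):
        issues.append("may cover two topics; consider splitting")
    if any(text.startswith(p) for p in LEADING_OPENERS):
        issues.append("possibly leading or presumptuous")
    return issues

print(review_question("Should your local government not raise taxes, not even for schools?"))
print(review_question("Are you satisfied with the hardware and software provided for your work?"))
print(review_question("Don't you think that most employees would rather have a four-day work week?"))
```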
Response Options
In the case of structured (closed-ended) questions you should carefully consider the response options. Response options are the answers that you provide to the people taking your survey, and they are usually captured using one of the following response formats:
- Dichotomous response, where respondents are asked to select one of two possible choices, such as true/false, yes/no, or agree/disagree. An example of such a question is: Do you think that the death penalty is justified under some circumstances? (circle one): yes / no.
- Nominal response, where respondents are presented with more than two unordered options, such as: What is your industry of employment?— manufacturing / consumer services / retail / education / healthcare / tourism & hospitality / other
- Ordinal response, where respondents have more than two ordered options, such as: What is your highest level of education?—high school / college degree / graduate studies.
- Interval-level response, where respondents are presented with a 5-point or 7-point interval scale, such as: To what extent do you agree with the following statement [insert statement]: strongly disagree / disagree / neither agree nor disagree / agree / strongly agree.
- Continuous response, where respondents enter a continuous (ratio-scaled) value with a meaningful zero point, such as their age, weight, or year of birth (How old are you? What is your weight in kilograms? In which year were you born?). These responses generally tend to be of the fill-in-the-blank type.
- Rank ordering the response options, such as "Rank the five elements that influenced your choice of the university program from the most to the least influential; assign number 1 to the most influential element and number 5 to the least influential element; properly assign numbers 2-4 to the rest of the elements."
Generally, respondents will be asked to choose a single (or best) response to each question you pose, though certainly it makes sense in some cases to instruct respondents to choose multiple response options. One caution to keep in mind when accepting multiple responses to a single question, however, is that doing so may add complexity when it comes to analysing your survey results. Nevertheless, for each closed-ended question, clearly instruct the respondents on the number of response options they are required to choose.
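If you are building or scripting a survey rather than using an off-the-shelf form tool, the response formats above can be modelled explicitly so that each question records its format, its options, and how many selections are allowed. The class and field names in this Python sketch are illustrative assumptions, not a real survey library.

```python
# A minimal sketch of representing the response formats described above in code.
# Class and field names are illustrative assumptions, not a real survey library.

from dataclasses import dataclass, field
from enum import Enum

class ResponseFormat(Enum):
    DICHOTOMOUS = "dichotomous"   # yes/no, true/false
    NOMINAL = "nominal"           # unordered categories
    ORDINAL = "ordinal"           # ordered categories
    INTERVAL = "interval"         # e.g., 5-point Likert scale
    CONTINUOUS = "continuous"     # fill-in-the-blank numeric value
    RANK_ORDER = "rank_order"     # rank a set of options

@dataclass
class Question:
    text: str
    response_format: ResponseFormat
    options: list[str] = field(default_factory=list)  # empty for continuous responses
    max_selections: int = 1   # state clearly how many options may be chosen

likert = Question(
    text="Online learning is more difficult than traditional in-class learning.",
    response_format=ResponseFormat.INTERVAL,
    options=["strongly disagree", "disagree", "neither agree nor disagree",
             "agree", "strongly agree"],
)
print(likert.response_format.value, len(likert.options))
```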
Guidelines for Designing Response Options
Here are a few guidelines worth following when designing the response options.
Ensure that your response options are mutually exclusive. In other words, there should be no overlapping categories in the response options. In the example about the frequency of alcohol consumption, if we ask “On average, how many times per week did you consume alcoholic beverages during your first semester of college?”, we may then provide the following response options: a) less than one time per week, b) 1-2, c) 3-4, d) 5-6, e) 7+. Notice that there are no overlapping categories in these response options.
Create response options that are exhaustive. In other words, every possible response should be covered in the set of response options. In the example shown in the paragraph above, we have covered all possibilities: those who drank, say, an average of once per month can choose the first response option (“less than one time per week”), while those who drank multiple times a day, every day of the week, can choose the last response option (“7+”). All the possibilities in between these two extremes are covered by the middle three response options. When you are unsure about capturing all response options, you can add the option “other” to the list to enable respondents to specify their response in their own words. This is particularly useful in the case of nominal, unordered response options.
If there is a reason to believe that not all respondents would be able to select from the given response options, then add the “not able to answer” or “not applicable to me” response to the response options list, in order to ensure the validity of collected data.
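For numeric response options like the drinks-per-week example, the mutually exclusive and exhaustive checks can also be verified mechanically. The following Python sketch assumes whole-number answers and encodes an open-ended top category with None; both choices are assumptions made for illustration only.

```python
# A small sketch that checks the drinks-per-week bins above are mutually
# exclusive and exhaustive for whole-number answers. The bin encoding is an
# illustrative assumption (upper bound None = "or more").

def check_bins(bins, max_value=50):
    """bins: list of (low, high) inclusive integer ranges; high=None means open-ended."""
    covered = set()
    for low, high in bins:
        top = max_value if high is None else high
        for value in range(low, top + 1):
            if value in covered:
                return f"overlap at {value}"   # not mutually exclusive
            covered.add(value)
    missing = [v for v in range(0, max_value + 1) if v not in covered]
    return f"gaps at {missing}" if missing else "mutually exclusive and exhaustive"

# a) less than 1, b) 1-2, c) 3-4, d) 5-6, e) 7+
weekly_drinks_bins = [(0, 0), (1, 2), (3, 4), (5, 6), (7, None)]
print(check_bins(weekly_drinks_bins))   # -> mutually exclusive and exhaustive
```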
Use a question matrix when answer categories are identical. Using a matrix is a nice way of streamlining response options. A matrix is a question type that lists a set of questions for which the answer categories are all the same (Table 7.4.2). If you have a set of questions for which the response options are all the same, create a matrix rather than posing each question and its response options individually. Not only will this save you some space in your survey but it will also help respondents progress through your survey more easily and quickly.
Table 7.4.2 Sample Question Matrix
Online learning… | Strongly disagree | Disagree | Neither agree nor disagree | Agree | Strongly agree |
---|---|---|---|---|---|
…is more difficult than traditional in-class learning. | 1 | 2 | 3 | 4 | 5 |
…is only appropriate for students employed part-time. | 1 | 2 | 3 | 4 | 5 |
…will become the predominant mode of learning in the near future. | 1 | 2 | 3 | 4 | 5 |
Ensure that response options are aligned with the question wording and vice versa. For example, if you have a yes/no question, then you should only provide the yes and no response options (e.g., “Do you support the Prime Minister in his endeavours to foster respectful communication?” with yes/no options). If you want to know about the frequency of an event (say, the frequency of alcohol consumption during college), then a yes/no question (e.g., “Did you consume alcohol during college?”) would be inappropriate. It would also be inappropriate to pair a yes/no question with ordered response options (e.g., every day, once per week, etc.). This is where attention to detail and proofreading skills come into play.
Questionnaire Design
In addition to constructing quality questions and posing clear response options, you’ll also need to think about how to present your written questions and response options to survey respondents. Designing questionnaires takes some thought, and in this section, we’ll discuss what you should think about as you prepare to present your well-constructed survey questions on a questionnaire.
In general, questions should flow logically from one to the next. To achieve the best response rates, questions should flow from the least sensitive to the most sensitive, from the factual and behavioural to the attitudinal, and from the more general to the more specific.
One of the first things to do once you’ve come up with a good set of survey questions is to group those questions thematically. In our example of the transition to college, perhaps we’d have a few questions asking about study habits, others focused on friendships, and still others on exercise and eating habits. Those may be the themes around which we organize our questions. Or perhaps it would make more sense to present any questions we had about precollege life and habits and then present a series of questions about life after beginning college. The point here is to be deliberate about how you present your questions to respondents.
Once you have grouped similar questions together, you’ll need to think about the order in which to present those question groups. Most survey researchers agree that it is best to begin a survey with questions that will make respondents want to continue (Babbie, 2010; Dillman, 2000; Neuman, 2003). In other words, don’t bore respondents, but don’t scare them away either.
There’s some disagreement over where on a survey to place demographic questions such as those about a person’s age, gender, and race. On the one hand, placing them at the beginning of the questionnaire may lead respondents to mistake the purpose of the survey. On the other hand, if your survey deals with a very sensitive or difficult topic, such as workplace racism or sexual harassment, you don’t want to scare respondents away or shock them by beginning with your most intrusive questions. Generally, it is advisable to put demographic questions at the end of the questionnaire, unless they are required at the beginning, for instance when particular demographic questions serve as filter questions.
In truth, the order in which you present questions on a survey is best determined by the unique characteristics of your research: only you, the researcher, ideally in consultation with colleagues, can determine how best to order your questions. To do so, think about the unique characteristics of your topic, your questions, and, most importantly, your sample. Keeping in mind the characteristics and needs of the potential respondents will help guide you as you determine the most appropriate order in which to present your questions.
Also consider the time it will take respondents to complete your questionnaire. Surveys vary in length, from just a page or two to a dozen or more pages, which means they also vary in the time it takes to complete them. How long to make your survey depends on several factors. First, what is it that you wish to know? Wanting to understand how grades vary by gender and year in school certainly requires fewer questions than wanting to know how people’s experiences in college are shaped by demographic characteristics, college attended, housing situation, family background, college major, friendship networks, and extracurricular activities. Even if your research question requires a good number of survey questions, do your best to keep the questionnaire as brief as possible. Any hint that you’ve thrown in unnecessary questions will turn off respondents and may make them not want to complete your survey.
Second, and perhaps more importantly, consider the amount of time respondents are likely to be willing to spend completing your questionnaire. If you are studying college students, your survey will be taking up valuable study time, so they won’t want to spend more than a few minutes on it. The time that survey researchers ask respondents to spend on questionnaires varies greatly. Some advise that surveys should take no longer than about 15 minutes to complete (cited in Babbie, 2010); others suggest that up to 20 minutes is acceptable (Hopper, 2010). This applies to self-completed surveys; surveys administered face-to-face by an interviewer may take up to an hour or more. As with question order, there is no clear-cut rule for how long a survey should take to complete. Consider the unique characteristics of your study and your sample in order to determine how long to make your questionnaire.
A good way to estimate the time it will take respondents to complete your questionnaire is through pretesting (piloting). Pretesting allows you to get feedback on your questionnaire so you can improve it before you actually administer it. Pretesting is usually done on a small number of people (e.g., 5 to 10) who resemble your intended sample and to whom you have easy access. By pretesting your questionnaire, you can find out how understandable your questions are, get feedback on question wording and order, find out whether any of your questions are unclear, overreach, or offend, and learn whether there are places where you should have included filter questions, to name just a few of the benefits of pretesting. You can also time pre-testers as they take your survey. Ask them to complete the survey as though they were actually members of your sample. This will give you a good idea about what sort of time estimate to provide respondents when it is administered, and whether you are able to add additional questions/items or need to cut a few questions/items.
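Pilot timings can also be summarized with a few lines of arithmetic to see how your draft compares with the roughly 15 to 20 minute guidance cited above (Babbie, 2010; Hopper, 2010). The numbers in this Python sketch are hypothetical pilot data, not results from any real study.

```python
# A tiny illustrative sketch: average hypothetical pilot-test timings and
# compare them with the commonly cited 15-20 minute guidance for self-completed surveys.

pilot_times_minutes = [12.5, 17.0, 14.0, 19.5, 15.0]   # hypothetical pilot data

average = sum(pilot_times_minutes) / len(pilot_times_minutes)
longest = max(pilot_times_minutes)

print(f"Average completion time: {average:.1f} min (longest: {longest:.1f} min)")
if average > 20:
    print("Consider cutting questions: average exceeds the ~20 minute guideline.")
elif average > 15:
    print("Borderline: review whether any questions can be trimmed.")
else:
    print("Within the commonly cited 15 minute guideline.")
```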
Perhaps this goes without saying, but your questionnaire should also be attractive. A messy presentation style can confuse respondents or, at the very least, annoy them. Use survey design tools such as Google Forms or Microsoft Forms to create a consistent design and to automate the collection and visualization of the data.
In sum, here are some general rules regarding question sequencing and questionnaire design:
- Brainstorming and consulting your research to date are two important early steps to take when preparing to write effective survey questions.
- Start with easy non-threatening questions that can be easily recalled.
- Never start with an open-ended question.
- If following an historical sequence of events, follow a chronological order from earliest to latest.
- Ask about one topic at a time (group the questions meaningfully around common topics). When switching topics, use a transition, such as “The next section examines your opinions about …”.
- Make sure that your survey questions will be relevant to all respondents and use filter or contingency questions as needed, such as: “If you answered “yes” to question 5, please proceed to Section 2. If you answered “no” go to Section 3.” Automated surveys will guide respondents to the next question based on the filters provided.
- Always pretest your questionnaire before administering it to respondents in a field setting. Such pretesting may uncover ambiguity, lack of clarity, or biases in question wording, which should be eliminated before administering to the intended sample.
Qualitative Interviews
Qualitative interviews are sometimes called intensive or in-depth interviews. These interviews are semi-structured: the researcher has a particular topic about which they would like to hear from the respondent, but questions are open-ended and may not be asked in exactly the same way or in exactly the same order for each respondent. In in-depth interviews, the primary aim is to hear from respondents about what they think is important about the topic at hand and to hear it in their own words. In this section, we’ll look at how to conduct interviews that are specifically qualitative in nature and consider some of the strengths and weaknesses of this method.
Qualitative interviews might feel more like a conversation than an interview to respondents, but the researcher is in fact usually guiding the conversation with the goal of gathering information from a respondent. A key difference between qualitative and quantitative interviewing is that qualitative interviews contain open-ended questions without response options. Open-ended questions are more demanding of participants than closed-ended questions, for they require participants to come up with their own words, phrases, or sentences to respond. In qualitative interviews you should avoid asking closed-ended questions, such as yes/no questions or questions that can be answered in a couple of words or phrases.
In a qualitative interview, the researcher usually develops a guide in advance that he or she then refers to during the interview (or memorizes in advance of the interview). An interview guide is a list of topics or questions that the interviewer hopes to cover during the course of an interview. It is called a guide because any good interview is a conversation, and one topic can lead to another that is not on your list. Interview guides should outline issues that a researcher feels are likely to be important, but because participants are asked to provide answers in their own words, and to raise points that they believe are important, each interview is likely to flow a little differently. While the opening question in an in-depth interview may be the same across all interviews, from that point on what the participant says will shape how the interview proceeds. Here, excellent listening skills, knowledge of your goals, and an intuitive approach work together to obtain the information you need from the interviewee. It takes a skilled interviewer to be able to ask questions; listen to respondents; and pick up on cues about when to follow up, when to move on, and when to simply let the participant speak without guidance or interruption.
As with questionnaire surveys, the interview guide should be derived from the research questions of the study. The specific format of an interview guide (a list of topics or a detailed list of general and additional interview questions) may depend on your style, experience, and comfort level as an interviewer, as well as your familiarity with the topic. If you are experienced in qualitative interviewing and highly familiar with the research topic, it may be enough to prepare the interview guide as a list of potential topics and subtopics to be covered during the interview. But if you are less experienced in qualitative interviewing and/or less familiar with the research topic, it is much better to prepare a detailed set of interview questions.
Begin constructing your interview guide by brainstorming all the topics and questions that come to mind when you think about your research question(s). Once you’ve got a pretty good list, you can pare it down by cutting questions and topics that seem redundant and grouping like questions and topics together. If you haven’t done so yet, you may also want to come up with question and topic headings for your grouped categories. You should also consult the scholarly literature to find out what kinds of questions other interviewers have asked in studies of similar topics.
Also, when preparing the interview guide, it is a good idea to separate the general questions from a few additional probing questions for each one. You can use these probing questions when required, such as when the respondent has difficulty answering a general question, needs additional guidance, or when you want to dig deeper into the topic. Interviewers are often left with a few questions that were not asked, which is why prioritizing the questions from most important to least important is good practice.
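If it helps to keep the guide organized, the structure described above (general questions, optional probes, and a priority order) can be captured in a simple data structure. The Python sketch below is illustrative only; the questions, probes, and priority scheme are assumptions for the transition-to-college example.

```python
# A minimal sketch of an interview guide structured as described above: general
# questions with optional probes, ordered by priority. The questions and
# priority scheme are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class GuideQuestion:
    general: str
    probes: list[str] = field(default_factory=list)  # used only if needed
    priority: int = 1                                 # 1 = most important

interview_guide = [
    GuideQuestion(
        general="Tell me about your first semester at college.",
        probes=["Could you tell me a little more about that?",
                "How did that compare with what you expected?"],
        priority=1,
    ),
    GuideQuestion(
        general="How did your study habits change after high school?",
        probes=["What helped you make that change?"],
        priority=2,
    ),
]

# Ask the most important questions first in case time runs short.
for q in sorted(interview_guide, key=lambda item: item.priority):
    print(q.general)
```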
As with quantitative survey research, it is best not to place very sensitive or potentially controversial questions at the very beginning of your qualitative interview guide. You need to give participants the opportunity to become comfortable with the interview process and with you. Finally, get some feedback on your interview guide. Ask your colleagues for some guidance and suggestions once you’ve come up with what you think is a pretty strong guide. Chances are they’ll catch a few things you hadn’t noticed.
In terms of the questions you include on your interview guide, there are a few guidelines worth noting:
Avoid questions that can be answered with a simple yes or no, or if you do choose to include such questions, be sure to include follow-up questions. Remember, one of the benefits of qualitative interviews is that you can ask participants for more information—be sure to do so. While it is a good idea to ask follow-up questions, try to avoid asking “why” as your follow-up question, as this particular question can come off as confrontational, even if that is not how you intend it. Often people won’t know how to respond to “why,” perhaps because they don’t even know why themselves. Instead of “why,” say something like, “Could you tell me a little more about that?” This allows participants to explain themselves further without feeling that they’re being doubted or questioned in a hostile way.
Avoid phrasing your questions in a leading way. For example, rather than asking, “Don’t you think that most employees would rather have a four-day work week?” you could ask, “What comes to mind for you when you hear that a four-day work week is being considered at our company?”
Collecting and Storing Interview Information
Even after the interview guide is constructed, the interviewer is not yet ready to begin conducting interviews. The researcher next has to decide how to collect and maintain the information. It is probably most common for qualitative interviewers to make audio recordings of the interviews they conduct. Interviews, especially those conducted on meeting platforms like Zoom, can also be captured using AI-assisted tools that provide a video, a transcription, and a summary of the interview. For in-person interviews, recordings can be captured by applications designed for this purpose, such as Otter.ai.
Recording interviews allows the researcher to focus on her or his interaction with the interview participant rather than being distracted by trying to take notes. Of course, not all participants will feel comfortable being recorded and sometimes even the interviewer may feel that the subject is so sensitive that recording would be inappropriate. If this is the case, it is up to the researcher to balance excellent note-taking with exceptional question asking and even better listening. Whether you will be recording your interviews or not (and especially if not), practicing the interview in advance is crucial. Ideally, you’ll find a friend or two willing to participate in a couple of trial runs with you. Even better, you’ll find a friend or two who are similar in at least some ways to your sample. They can give you the best feedback on your questions and your interviewing performance.
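If you record your interviews and want to generate your own transcript rather than relying on a platform's built-in tools, a speech-to-text model can do much of the heavy lifting. The sketch below is a minimal example using the open-source Whisper model; it is not drawn from the adapted sources, and the file names are hypothetical.

```python
# A minimal sketch, assuming the open-source "openai-whisper" package and ffmpeg
# are installed (pip install openai-whisper). File names are hypothetical;
# platforms such as Zoom or Otter.ai provide their own built-in transcription.

import whisper

model = whisper.load_model("base")             # small, general-purpose model
result = model.transcribe("interview_01.wav")  # returns a dict with a "text" key

# Save the transcript alongside the recording for later review and coding.
with open("interview_01_transcript.txt", "w", encoding="utf-8") as f:
    f.write(result["text"])
```

Whatever tool you use, always review the automated transcript against the recording before analysis, and store both securely in line with your ethics and confidentiality commitments.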
In large community projects, such as real estate developments, rezoning, and urban renewal, as well as in more local or internal innovation and change initiatives, it is always a good idea to involve stakeholders in conversations and consultations. How you design the consultation process will affect its success, so be deliberate in your planning. Stakeholders' input can be a researcher's most valuable asset, guiding decisions that move projects toward the benefit of all affected. Using stakeholder mapping, ethical consultation practices, and effective question design will facilitate the process.
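One common way to begin stakeholder mapping is a power-interest grid, which sorts stakeholders into quadrants ("manage closely," "keep satisfied," "keep informed," "monitor") based on how much influence they hold and how strongly the project affects or interests them. The sketch below is a minimal illustration of that idea; the stakeholder names, scores, and thresholds are hypothetical.

```python
# A minimal, hypothetical sketch of a power-interest grid, a common stakeholder
# mapping technique. Stakeholder names, scores (1-10), and thresholds are
# illustrative only.

def classify(power: int, interest: int) -> str:
    """Place a stakeholder in one quadrant of a power-interest grid."""
    if power >= 5 and interest >= 5:
        return "manage closely"
    if power >= 5:
        return "keep satisfied"
    if interest >= 5:
        return "keep informed"
    return "monitor"

stakeholders = {
    "Municipal planning department": (8, 6),
    "Neighbourhood residents": (3, 9),
    "Local business association": (6, 4),
}

for name, (power, interest) in stakeholders.items():
    print(f"{name}: {classify(power, interest)}")
```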
Additional Resources
For information on developing stakeholder consultation with Indigenous communities, read Part 1: Stakeholder Consultation (Clennan, 2007). For a recent and local example of a stakeholder engagement plan, see the University of Victoria's Campus Greenway engagement plan (University of Victoria Campus Planning and Sustainability, n.d.). A significant step in this plan, a Design Charrette, was implemented in the fall of 2018; the results of that engagement activity, presented in a Summary Report (.pdf) (University of Victoria Campus Planning and Sustainability, 2018), led to changes to and augmentation of the original plan based on stakeholder feedback.
The segment on Asking Survey and Interview Questions was partially adapted from Marko Divjak's Google Classroom unit "Asking Questions in Surveys and Interviews" (2007-2025), available on OER Commons.
His work consisted of remixing and adapting the following two open educational resources, both licensed under a CC BY-NC-SA license:
Bhattacherjee, A. (2012). Social Science Research: Principles, Methods, and Practices. University of South Florida: Scholar Commons (chapter 9). Retrieved from: https://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1002&context=oa_textbooks
Blackstone, A. (2014). Principles of sociological inquiry – Qualitative and quantitative methods (chapters 8 and 9). Retrieved from: http://www.saylor.org/site/textbooks/Principles%20of%20Sociological%20Inquiry.pdf
References
Association of Project Management. (2020). What is stakeholder engagement? [Video]. YouTube. https://www.youtube.com/watch?v=ZzqvF9uJ1hA&t=1s
Babbie, E. (2010). The practice of social research (12th ed.). Belmont, CA: Wadsworth.
Bhattacherjee, A. (2012). Social Science Research: Principles, Methods, and Practices. University of South Florida: Scholar Commons. Retrieved from: https://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1002&context=oa_textbooks
Clennan, R. (2007, April 24). Part 1: Stakeholder consultation [PDF]. International Finance Corporation. https://documentcloud.adobe.com/link/track?uri=urn:aaid:scds:US:1e1092b2-09d5-47c1-b6bd-be2a96f46081
Dillman, D. A. (2000). Mail and Internet surveys: The tailored design method (2nd ed.). New York, NY: Wiley.
Driscoll, C. & Starik, M. (2004). The primordial stakeholder: Advancing the conceptual consideration of stakeholder status for the natural environment. Journal of Business Ethics, 49(1), pp. 55-73. https://doi.org/10.1023/B:BUSI.0000013852.62017.0e
Hagan, M. (2017, August 28). Stakeholder mapping of traffic ticket system. Open Law Lab. http://www.openlawlab.com/2017/08/28/stakeholder-mapping-the-traffic-ticket-system/. CC BY-NC-SA 4.0.
Hopper, J. (2010). How long should a survey be? Retrieved from http://www.verstaresearch.com/blog/how-long-should-a-survey-be
Last, S. (2019). Technical writing essentials. https://pressbooks.bccampus.ca/technicalwriting/
Microsoft. (2025, January 27). Drafting survey questions prompt assistance. Copilot.
Mungikar, S. (2018). Identifying stakeholders [Video]. YouTube. https://www.youtube.com/watch?v=8uZiGB8DeJg
Neuman, W. L. (2003). Social research methods: Qualitative and quantitative approaches (5th ed.). Boston, MA: Pearson.
Nikki. (2004). Wicked. Urban Dictionary.
OHCHR. (2013). Free, prior and informed consent of Indigenous Peoples [PDF]. Office of the High Commissioner for Human Rights.
Purdue Online Writing Lab (OWL). (n.d.). Creating good interview and survey questions. Purdue University. https://owl.purdue.edu/owl/research_and_citation/conducting_research/conducting_primary_research/interview_and_survey_questions.html
QuestionPro. (n.d.). 10 steps to a good survey design [Infographic]. Survey design. https://www.questionpro.com/features/survey-design/
United Nations. (2015). The 17 goals. Sustainable Development.
University of Victoria Campus Planning and Sustainability. (n.d.). Engagement plan for: The University of Victoria Grand Promenade landscape plan and design guidelines. Campus Greenway. University of Victoria. https://www.uvic.ca/campusplanning/current-projects/campusgreenway/index.php
University of Victoria Campus Planning and Sustainability. (2018). The Grand Promenade Design Charrette: Summary report 11.2018. Campus Greenway. University of Victoria. https://www.uvic.ca/campusplanning/current-projects/campusgreenway/index.php