Friday 30 November 2012

Ethics...oh dear

So my original research question really had nothing to do with people - well, with people who were recently alive. Originally I was going to do research on the posthumous fame of artists, which was inspired by the Frida and Diego exhibit at the AGO (which you should all go check out!). It was going to be, for the most part, based on content analysis.

However, we are a long way from SSHRC and things have changed. Instead, I am interested in the information-seeking behaviour of children, which stems from the realization that young patrons (7-12ish) tend not to want to use computer-based resources for class assignments. So there we have it: my proposal will now not only include live people, but live children...people.

There is a lot of responsibility involved in research, and even more so when the subject is a person. It involves building relationships that are transparent - and the foundations of these are trust and disclosure. Knight summarizes how to build this relationship (p. 170), distilling the information that was referred to in the readings and in class, by providing a list that I will most definitely refer to as I write this daunting proposal.

Tuesday 27 November 2012

Being grateful for research subjects

Like Michelle said in her blog post, Knight's points on having a backup plan for when research does not go as planned are extremely helpful. It's very easy, especially in the proposal stage, to lean on the idea that the research will go completely as planned or that, at the very least, enough data will be collected to justify the study in the first place. The reality is that people drop out of studies, data can be mishandled, and in general anything can go wrong! Our guest speaker this week reminded us that no one owes a researcher their time, and I think this mentality is important for planning a research project that deals with human subjects. It is easy to view our subjects as potential data warehouses, but in reality they are people with lives and more pressing matters than your project (sadly!). I think it's important to keep in mind that studies would not exist without the help of co-operative research subjects. Thinking of research subjects in this way has also helped me to write the "ethics" portion of my proposal, as it caused me to put a lot of thought into how to justify asking students to participate in my study. Seeing a more "human" side to research has ultimately made me a better potential researcher!

Monday 26 November 2012

Plan B's and interviewing children

The readings this week were all helpful and applicable to us as we continue to work on our proposals. In the Knight reading, I found it interesting to learn that it is important to have a "plan B" for when your research project goes wrong. He talks about some potential problems that could occur, from participants withdrawing from your study to losing data (Knight, 162-163). This is why it is always important to back up your computer, which is something I need to get in the habit of doing regularly!

Another point of interest related to my own research project came from the Heath et al. article. I discovered that even though children may have been given a choice of whether to participate or not, they may have said "yes" to the study because they are trying to please or are afraid of the consequences of not being seen as "cooperative" (Heath et al., 414). Since part of my research methodology involves interviewing children, this was very relevant and something I had not considered before. Additionally, the idea that children are "eager participants" is a faulty perception. Not only do we have to ensure the participants have a "voice," but Heath et al. go further by suggesting that researchers should also be respectful of the rights of children to remain silent in discussions (Heath et al., 415). All of these are major implications for anyone doing research with children.

Research Proposal

Hi everyone,
I was freaking out just a little bit over the research proposal assignment, so I googled sample research proposals and found a couple that seem to correspond to our guidelines in terms of length and sections. Feel free to forward these to other members of our class - I can't seem to post on blackboard...

This is a master's proposal specifically:

https://www.google.ca/search?q=sample+research+proposal+focus+groups&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a

This is another sample:

http://www.scribd.com/doc/17320747/Sample-Research-Proposal

Good luck!

Polina

Research Ethics


This week we looked at the ethics involved in doing research with human subjects. I personally have never done research with living subjects - every subject I've looked at previously in a school research project has been dead for 100 years. I found that Knight's list about building relationships coincided with the University of Toronto's Guidelines and Practices Manual for Research Involving Human Subjects. I argue this because Chapter 2 of the Manual outlines the Principles of Research Ethics, which include: respect for human dignity, free and informed consent, respect for vulnerable persons, respect for privacy and confidentiality, respect for justice and inclusiveness, balancing harms and benefits, minimizing harms, and maximizing benefits. After reading Knight's list, it is clear that research ethics can help build a good relationship with research subjects. For example, free and informed consent in the manual is like the first point on Knight's list: "Getting potential participants' informed consent by telling them clearly what the research is about" (p. 170). Thus, maybe there is not such a sharp line between the guidelines for research ethics and building a good relationship with the participants of a study.

Views of Cell Antennas - Urban and ICT Infrastructure Perspectives


As the potential of mobile technology begins to be realized, a frequently ignored piece of infrastructure is the broadcast antenna. The proliferation of these antennas in the urban environment has, in some municipalities, challenged our understanding and acceptance of visible infrastructure and the aesthetics of the urban fabric. Standard cell antennas are tall, narrow, grey blocks (http://www.mbs.ie/images/antenna3.jpg). The corporate structure of the wireless industry means a duplication of the physical infrastructure, as each competing company builds overlapping networks. This has led to buildings bristling with antennas (see the next link). Such obvious infrastructure can, in some municipalities, fade unaltered into the background, while in others deliberate steps are taken to camouflage the antennas. While looking for photos to illustrate this point, I found a popular article on this subject (http://www.theatlanticcities.com/design/2012/07/dont-hide-cell-phone-towers-embrace-them/2437/). An interesting question is why this is necessary. Star's model of infrastructure may provide important insights into understanding how things become infrastructure, and therefore help explain why certain communities need an extra push to see this ICT infrastructure as also part of their urban infrastructure.

Twitter!


Michelle, Laura, and I chose Twitter as a good subject for a case study because it encourages communication with the world at large concerning opinions, ideas, or even news updates. Twitter allows people to express themselves in "tweets" that are 140 characters or fewer. Users can mention each other in tweets or use hashtags to emphasize the subject matter of the tweet.
Hashtags make the collection and organization of data less complicated, and trending topics will often give a researcher a good idea of what the "Twitterverse" finds particularly important (see the sketch at the end of this post). Twitter also allows for hyperlinking, retweeting, and replies to other tweets.
Many businesses/celebrities have Twitter accounts to increase or promote their brand. The number of followers a given user has will indicate their popularity, and who a user follows will indicate what their interests are. Followers/Following can be a useful resource in analyzing a person’s commercial interests.
Twitter poses a challenge because of its character limit. There is a danger that a tweet may be misconstrued because of its stipulated length. The fact that some Twitter accounts are set to private, and therefore inaccessible to the public, is also a challenge. 
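Since we mention hashtags as a way of organizing data, here is a minimal sketch (in Python, using a handful of made-up tweets rather than anything pulled from Twitter itself) of how a researcher might group a small sample of tweets by hashtag and get a rough sense of which tags come up most often:

from collections import Counter, defaultdict

# A few invented example tweets, for illustration only.
tweets = [
    "Snow day in Toronto! #winter #toronto",
    "Research proposal due in three weeks #gradschool",
    "Anyone else buried in readings? #gradschool #library",
    "First snowfall of the year #winter",
]

def extract_hashtags(text):
    """Return the hashtags in a tweet, lower-cased, without the leading '#'."""
    return [word.lstrip("#").lower() for word in text.split() if word.startswith("#")]

# Group tweets by hashtag - the "already somewhat organized" quality that makes
# hashtags convenient for collecting and sorting data.
by_hashtag = defaultdict(list)
for tweet in tweets:
    for tag in extract_hashtags(tweet):
        by_hashtag[tag].append(tweet)

# Count hashtag frequency as a rough stand-in for what is "trending" in a sample.
trending = Counter(tag for tweet in tweets for tag in extract_hashtags(tweet))
print(trending.most_common(3))
print(by_hashtag["gradschool"])

Real Twitter data (private accounts, the 140-character ambiguity mentioned above, and so on) would of course complicate this considerably; the sketch only shows why hashtags make the organizing step easy.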

Sunday 25 November 2012

Mini Online Research Assignment: Emotional Information Seeking, Tumblr, and Aesthetics

I've chosen tumblr as a potential emotional/affective space for study because of the ease with which tumblr users can customize their blogs to create a particular aesthetic/mood, and the way this aesthetic is often represented by commercial products.

What is it? Tumblr is a highly customizable microblogging platform and social networking site that allows users to blog (and reblog) photos, music, links, quotes, videos, and personal writing.

Who is it tailored to? Tumblr has 82.2 million blogs. It is popular for "curation," where users put together collections of images and other media that create a particular feel, mood, or aesthetic, or fit a particular theme or subject. Users often keep tumblr blogs that they treat much like other blogging platforms such as Blogger or WordPress, writing personal text posts and including images from their own lives - but the nature of tumblr as a microblogging platform makes it easy for users to quickly dump brief, immediate thoughts on their blog, much like Twitter. Businesses and companies often have tumblrs, where they both promote their products and interact with the greater community. For example, the publishing house W. W. Norton & Company has a tumblr where they not only promote their books but discuss and share all kinds of interesting literary things. The clothing chain Anthropologie has a tumblr where they post media that fits in with their overall aesthetic (though the posts don't necessarily contain Anthropologie products). Independent of explicitly business-run tumblrs, individual users often seek, like, and reblog commercial products, use tumblr as a way of finding new items to purchase, or blog images of particular products in order to contribute to their blog's overall aesthetic.

What emotional relations might exist? Users of tumblr often develop relationships based on shared interests (self-portrait photography, fashion, poetry, literature, etc.), or similarities in personality - it can be a way of finding friends online who share your interests. But tumblr's focus on aesthetics, and its ability to easily create a particular aesthetic (because of its customizability and the ease with which things can be reblogged to a site) or embody a particular mood, means that people are often attracted to blogs that communicate a particular aesthetic, mood, or identity (often represented by commercial products and other people's music/photography) that they share or can relate to.

Why would it make an ideal case study? The relationship visible on tumblr between aesthetic/mood/identity and commercial products is relevant to the project's focus on ICTs tailored towards "emotional communication, resources, and identities, as part of a larger trend in marketing and commercialization." Also, tumblrs are often used as extensions of other social media platforms, like YouTube channels or more "professional" websites, making it possible to explore emotional information seeking across online spaces, and the relationship between tumblr and other parts of the web.

Potential methodological challenges. Narrowing the 82.2 million blogs on tumblr to a manageable and cohesive sample seems to be the most immediately foreseeable methodological challenge. Operationalizing notions of aesthetic, mood, and identity in the context of tumblr blogging would also require a lot of thought and care.


Wednesday 21 November 2012

So I think that I need to share what I learnt from my peer review. I know, I am a little late with this, but I am sharing nonetheless. I did my peer review on "President or Dictator? A Comparison of Cuban American Media Coverage of Cuban News". I think that my biggest problem with this article is that the authors did not tell the reader how they arrived at their percentages. They took for granted that the reader would just accept what was presented. As a result they severely compromised their validity and reliability, weakening their argument.
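To make that point concrete, here is a minimal sketch of the kind of calculation the article could have spelled out. The coding categories and counts below are entirely invented; the point is just that reporting raw counts and the total alongside the percentages shows the reader exactly how each figure was arrived at.

from collections import Counter

# Hypothetical coding results: each article in the sample is assigned one frame
# by a coder. These labels and counts are made up for illustration only.
coded_articles = ["president"] * 18 + ["dictator"] * 27 + ["neutral"] * 15

counts = Counter(coded_articles)
total = sum(counts.values())

# Reporting counts, the total, and the resulting percentages together makes the
# arithmetic behind each reported figure transparent.
for category, n in counts.most_common():
    print(f"{category}: {n}/{total} articles ({100 * n / total:.1f}%)")

Even a short table built this way (counts, total, percentages) would have let readers check the numbers themselves.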

O. R. Holsti and R. P. Weber were the two sources that I consulted extensively while looking into Content Analysis. They were both very helpful resources and I totally recommend them (even if they are a little dated), especially since I have taken my notes from them and won't need to borrow them from the Inforum again.

Tuesday 20 November 2012

Hine & Ethnography

I found Hine's article to be quite interesting, and I actually really enjoyed reading it. I think of all the research approaches and methods discussed thus far, Hine's attitude sounds the most "realistic" to me. What I mean by this is that her description of ethnography as a reflective process (p. 7) is a fair assessment, and I agree that while ethnographers may have preliminary thoughts on the direction their research will take, these thoughts must be constantly re-evaluated in the face of field experiences. I find her thoughts realistic because I think, in a way, they hold true for other methods of research as well. Take interviews, for example. No matter how much time and revision and planning a researcher puts into creating their questions, until the interview begins the interviewer really has no idea what kind of questions will be important to their study. Often an interviewee will take the interview in a direction that the interviewer could not have predicted, and this can have a great effect on the researcher's findings. A researcher cannot really plan every detail of their study; they are more often than not "along for the ride".
Luckily, most researchers are experienced enough to have a firm grasp on realistic expectations of their research and are not truly taken out of their element. For the less experienced, however, almost any study can possess Hine's element of "wait and see" research. I appreciate the fact that Hine acknowledges the unpredictability of research in an ethnography and her article has opened my eyes to the benefits of an open attitude concerning social science research in general.

Monday 19 November 2012

Qualitative research and the Internet


I found the Hine reading to be particularly interesting. Hine points out that although the Internet could possibly expand the spatial structure of a research study, we are only able to engage a certain number of people (Hine, 2009, 18). She discusses the pragmatics of what a researcher can achieve, which was helpful. Although I use the Internet on a daily basis, I struggle with how I would go about incorporating this into a potential research project method. One inevitable issue is the idea that some users can create an entirely "new" identity, which leaves us wondering whether they are really being their true selves. Also, some feel safe when they "hide" behind these new "identities" and are more open in their comments, e.g. on forums, blogs, etc. On the positive side, we could see this as creating greater diversity in the dialogue and making it more representative. For example, those who are less likely to offer opinions in groups feel more comfortable in an online setting. One limitation, though, is that a researcher is not able to physically see the participants and go beyond the written words. In this sense it is difficult to understand the context, because we do not see disposition and body language the same way as in face-to-face interviews, for example. So the concern is: how can researchers address these issues? Hopefully I'm not the only one thinking about this. Another interesting aspect that we touched on in class today is that the individuals you are researching may be apprehensive about giving you information. As a result, the implication is that you really have to do a good job of proving you are doing authentic research and are able to maintain confidentiality.

Pinterest Ethnography

Our group chose to look into the first assignment option. For the sake of convenience, here's the original question:

You are an aspiring research assistant for a major social science project entitled “Emotional Information Seeking: ICTs and the commercialization of emotional and affective spaces online.” The project broadly investigates why ICTs are increasingly tailored towards emotional communication, resources, and identities, as part of a larger trend in marketing and commercialization.

At this stage it is too early to undertake in-depth qualitative interviews. Your supervisor has asked that you select a specific example of an online environment tailored towards emotional information seeking, which might serve as an ideal case study for the project.

Either individually or in small groups, pick any example, and prepare a 200-400 word explanation of the site/app. Include what it is, who it is tailored to, what emotional relations might exist, why it would make an ideal case study for the larger project, and (if applicable) the potential methodological challenges you will have to address. Post to blog.


Explanation:

Our group consisted of Chantel, Mike, Elison and me (Polina). We chose the "Pinterest" site as our case study. Pinterest is "an online pinboard" (taken from the site's description). It invites you to "share what you love". Basically, a user can create a profile where they post images that appeal to them, with the ability to add comments to the images. The profile can be linked to Facebook, Twitter, and other sites.
We considered the site to be a good case study for several reasons:
- There is the possibility of generating lots of rich data that's already somewhat organized by the user (through categories such as "to get in my closet" or "too cute for words").
- Since the project has to do with consumerism and commercialization, Pinterest is ideal as it demonstrates how stores infiltrate our social lives, and how our identities are shaped by the objects we single out. We even found an article advising stores on how to use Pinterest to generate sales!
- We liked the fact that there could be an interesting visual component to the research.
- Ethnographic sensibility is all about how people make meanings, and how these meanings are inscribed into their information environment. Pinterest is a perfect space for people to do just that.

Challenges:
- We have no access to what people didn't "love" - they only pin what they find agreeable.
- We don't have a complete profile of the user for the purposes of categorization (gender, age, socioeconomic status, etc.)
- It is very easy to impose the researcher's interpretation on the images - the users do not have the ability to give feedback (unless you contact them for more in depth interviews later on...)

*My own challenge would probably be tearing myself away from all the cute puppies photos... But that's just me. :)

Online/Offline Ethnography

Until I read Hine's article about ethnography and the internet, I only thought of ethnographic research online as happening online. I never really considered contextualizing internet ethnography offline as well. Hine, however, made it make perfect sense.

I really liked the example that Hine used in regard to the research conducted by Max Forte (2005). I found it really interesting how he volunteered to develop websites in order to explain the cause of the resurgence of aboriginal identity in the Caribbean. He not only helped to create the website as an outsider, most likely interpreting the needs of the community, but in doing so helped to create their online identity. Thus, as Hine's article shows, many identities were brought together through this particular website.

I think that the changing environment of how we interact will change the way that ethnographic research works. This, for me, was a really interesting example that made me rethink how ethnographic research can be framed in terms of method.

Paradoxes of infrastructure


I've encountered Susan Star's work a number of times so far in this program, yet I still found it refreshing to revisit "The Ethnography of Infrastructure" after having first read it last year.

This time around, while looking at the "paradoxes of infrastructure" section, I quite appreciated the idea of how apparently small barriers can become huge obstacles for people. I reflected on a recent experience: when starting the peer-review assignment for this class, I decided to review some of the lecture slides. The Inforum computer I was at did not have Flash enabled in Firefox (which is what I use by default), so I couldn't immediately view them. Rather than trying to find a workaround I just reverted to my pen-and-paper class notes, and saved reviewing the slides for later that evening when I would have my own laptop that I knew could render the slides. The fix might have been simple, but stopping to figure it out would have distracted me from the task that I really wanted to accomplish, which was thinking about the assignment.

Infrastructure examples from my work experience


Star's discussion of how work processes can be completely disrupted by seemingly insignificant changes strongly resonates with me personally. Part of my work responsibilities has included developing procedures to standardize and simplify communications strategies. Conceiving of the second level of assemblage explains, in hindsight, some of the problems that I noticed with the procedures I drafted, even in my own execution of them. What is particularly interesting is when the values of the assemblage layer conflict with the goals being implemented in the more visible workflow layer. For example, in an attempt to provide more consistency in communications I built a series of semi-automated forms for staff to use. However, the time required to use the forms conflicted with the value of speedy turnaround of client requests. While these were all things that I noticed without the need for a deep infrastructural investigation, Star's framework does give me a good vocabulary to think and talk about them.

On an Attitude of Openness.

This course has helped reveal my research interests not just in terms of what kinds of research I might want to do, but also in terms of the methods I find most interesting. My blog posts often talk about interviews and narratives, and I especially enjoy the readings related to ethnography. This week, the chapter by Christine Hine on defining the boundaries of an internet research project was no exception. My experience with ethnography so far has been limited to studying the "real" world (as opposed to the internet world? I know it's not really that clear a division), and understanding the use of ethnography for studying the internet was really interesting. Hine's appreciation for ethnography as a method was similar to mine. I like the fact that ethnography (especially in relation to information science) doesn't really limit what is considered "information" - the holistic nature of ethnography means that as a researcher, an attitude of openness is important, and it seems like, to the ethnographer, everything matters, which makes defining the boundaries of your project important (and tricky!). Hine's comment about being "pre-disposed not to accept taken-for-granted ideas about what technologies can do and how they come about" (5) highlighted, for me, the importance of ethnography for understanding technologies, and for understanding the social realms online, especially when she discussed the bush pump, and the idea of technologies having an identity. She writes that "the identity of the technology, and thus where to stop and start studying it, cannot be decided in advance" (4), and although this makes the process of defining the boundaries of any study a challenge, it also highlights the importance of an attitude of openness when approaching research, no matter your chosen field. It makes me think of the issue I had with the Yin reading last week, which is kind of the same thing - this openness, whether in writing (in the case of the Yin article) or not, is a laborious thing, but seems to me the only way to develop a robust understanding of a case, or a technology, or whatever it is you're interested in researching.

Sunday 18 November 2012

Surprisingly un-boring infrastructure

I really enjoyed Star's article on the ethnography of infrastructure. I didn't think I would, since, as the article states, infrastructure seems boring. It's the nuts and bolts, the wires, the stuff that's invisible. But I found the following to be very true: "one person's infrastructure is another person's difficulty". It's interesting that, once again, a couple of the courses I'm taking intersect and discuss similar issues in the same week. In my Introduction to Reference course, I just read an article discussing the way humanities scholars search online databases. The article, by Marcia Bates, is an old one, and talks about the very beginnings of computer database searching, but as I was reading it I thought that much of it is still the case! The databases we use, and in particular how they are constructed, are very much "science-y", requiring us to translate our thoughts into precise queries that the computer can process. I noticed that some databases now include a "visual search", perhaps to appeal to those of us with a more visual learning style. Incidentally, when I tried to do a "visual search", I was not successful. Or at least it looked like I was not successful, because the interface didn't change. But moreover, even if I were successful, the search would still require me to translate my vague, gut-feeling of a query into concrete terms, the more precise the better. This shows a definite bias towards science in the way we approach searching - everything is logical, standardized, codified, hierarchical. I'm not suggesting by any means that I know a better way to go about searching for information, but this is one very concrete example of infrastructure that most of us don't even notice (hence the transparency) until it becomes a hindrance and suddenly comes into view.

Bush Pump


I particularly liked Hine's discussion of the 'bush pump' and its 'fluidity' of identity. Wow. I guess I never thought of technology as having fluidity of identity before, ever. I understand the idea of our individual identity that is constantly changing, affected by the environment, external and internal. We talk about such concepts in gender studies classrooms. And yet the bush pump, which I had never heard of before, can, just like humans, vary and have flexibility in definition. Establishing the meaning of the bush pump requires putting it in context. I find it fascinating and true. Markham's example of how people view the Internet drives the point home: users view the World Wide Web as "a tool, a place, and a way of being".
Over the last few weeks, all the readings we've had for this class have made one shared point for me: research is interpretative. Hine, Markham, and Law confirm my theory with their discussion of methods in social science and how they "shape the ways in which it is possible for us to think". In my previous post I wrote about the importance of having access to as many existing views and opinions as possible, so that one can progress with his/her own interpretation and develop an individual idea that is, perhaps, slightly more advanced in the end. Sometimes these individual developments become precious scientific discoveries. Sometimes they are just thoughts. I am becoming convinced that any research, because it is interpretative, subjective, and has to be understood within the researcher's context, leads to progress in the interpretation of ideas. That in itself is valuable. Yay, bush pump!

Wednesday 14 November 2012

Narratives as a Process for Analysis?

Yin's article mentions that one issue with case studies is that researchers are commonly inclined "...to develop well-polished narratives for such items as individual interviews, specific meetings or other major events, logs of daily or weekly activities, and summaries of individual documents or reports" (p. 60), explaining that unless these individual narratives need to be published, they take too much time to produce and it's hard to decide around which topics a narrative should be organized, and how data should be integrated. This particular criticism made me think... At various points in two of my other classes this term, we've discussed Grounded Theory (Glaser & Strauss), where writing up field notes in the form of memos is a huge part of the process of data analysis, and the point at which important codes and categories emerge and can begin to be compared. Maybe Yin's criticism doesn't make sense to me because I haven't read that many case studies (and maybe they're commonly really poorly organized?), but his argument that constructing narratives in the process of analysis is a problem because it takes too much time doesn't make sense to me - yes, it takes time, but doesn't the process of writing, and of trying to understand how the key elements of an interview or meeting fit together, ultimately help a researcher to think through the realities of her case? And wouldn't putting together a narrative (especially at an early stage) help to reveal the important topics/the topics it should be organized around? Both Luker and Knight (and especially Luker) hammer home the point that regular writing is essential to the research process. I feel like the more time you spend trying to think through your research in writing, the better (which is at least partly because I know this particular process helps me).


A case for mixed methods

First of all, I am sorry this blog post is so late! Usually I post far earlier but, as I'm sure everyone knows, assignments are being handed in left and right. Still, I am glad I have time to discuss our class, because I found that some of it gave me a little more hope for research (in contrast to how disillusioned I was feeling about it last week).
To start, I, like many other students, had no idea there was such a controversy over case studies. Before reading Yin's article, I had assumed they were just another research method to be considered. I am glad I read this article though, as it helped me to examine the kinds of research that may find case studies more appropriate. Through our class exercise, I was able to examine my research proposal in a new light and consider a new method of research (though I doubt a case study is in the cards for me)!
The most relevant thing I took away from the last class was using mixed methods to inform each other. I was toying with the idea of using a mixed methodology in my (new and improved!) research proposal, but had never thought of using the methods one after the other. In a few articles I read for my INF1230 essay, it was apparent that many studies based their results on two separate samples. Granted, both samples used the survey instrument, but the first round of survey results heavily influenced the second survey instrument. I am writing this from memory, so I hope it makes sense, but I really liked the idea of collecting data twice and using the results from the first round to inform the second. I think I may use this approach in my proposal, but with a mixed-methods aspect, in that I would conduct a web survey and then invite respondents to participate in a follow-up interview or focus group depending on the results. Before Monday, I had thought of doing both surveys and interviews, but had not considered how I could use them to inform each other.
I realize I am rambling now, but I am glad to have found a little bit of inspiration to continue slogging through my proposal! (Less than 3 weeks away, holy crap!)

Monday 12 November 2012

Case studies

From this week's readings, what I found particularly interesting was the "Not Another Case Study" article. I enjoyed learning about why case studies exist and continue to be used specifically in the science and technology fields. Case studies can offer a helpful comparison for making differences more visible (Beaulieu et al., 2007, p. 687).
According to the authors, the "middle range" is not about finding a "middle point" that links data and theory, nor is it referring to the "middle ground" between micro- and macro-level analyses. Rather, it is discussed in terms of carefully considering the situations where selecting ethnography as a method can lead to problems or successes in research findings. The aim is to understand the "changing relations between methods, concepts and empirical work" (Beaulieu et al., 2007, p. 673). What this means to me as a new student in research methods is that even if a method is popular within a particular field, you still must carefully assess whether it fits your research question and is relevant to the research you wish to do.
Admittedly, when I first saw the word "e-science" in the article, I thought the authors were referring to electronic devices and technologies! However, the authors use the word e-science to refer to critical exploration science. They discuss the importance for researchers of being sensitive to diversity, explaining that one limitation of e-science is that it typically offers "one-size-fits-all" thinking. An explanation is given of how such standardized tools could be difficult to apply in certain areas, like the women's studies example, which is incredibly different from fields like molecular biology. Studying humans is not a linear process. Applying this knowledge to my own research proposal, this article also taught me about the importance of being flexible as a researcher; that is, if a method does not make sense it is perfectly all right to formulate a 'hybrid' methodology, which employs complementary methods. The goal is for your methodology to fit your research question.

Case Studies and CSI

Confession: I am a huge CSI fan. So when I read Yin's article, I felt like I related to his comparison of detective work to an academic case study, specifically when it comes to building explanations. It really seems to have the same twists and turns that my favourite corny Wednesday night show does.

(a) an accurate rendition of the facts of the case.
This is where, in a case study, the relevant data is collected in order to find out about the actual case, and what is happening/happened to explain a certain phenomena;
OR, this is where Grissom (I know it's Ted Danson now, but Grissom was so much better, so I'm sticking with him), or another member of the CSI team, finds the body, or discovers a mystery, and the team collects evidence to try to build the facts of the case.

(b) some consideration of alternative explanations of these facts.
This is where, when I am doing research, I collect theories and explore previous research (because my background is in history) to discover a framework for why and how the events being studied actually happened;
OR, this is about the time in the show where the detectives explore the different possible suspects. This is usually halfway through the show, and there is usually a twist (when writing a paper, however, I like to avoid giant plot twists halfway through).

(c) a conclusion based on the single explanation that appears most congruent with the facts.
This is where, after an analysis of the case study, a conclusion can be drawn,
OR, Grissom catches the killer.

Obviously, this is a lot harder in real life, and different in academia. In my previous master's degree, I studied crime in the American South, and I really wish I had felt like a detective when writing the paper. These steps are not so clear-cut, and the results are much more complex, but I think that these basic steps are important to keep in mind in academic studies.

On Evidence (archival-ish perspective)

A lot of people seem to be focusing on the problem of evidence from case studies, so I thought I'd comment on this too. Interestingly enough, I just had to do a presentation on evidence as an archival concept. There is a really good article by Brien Brothman called "Afterglow" that discusses evidence, and where it resides (and whether one can even use this sort of language when discussing evidence). Brothman's article is really long, but his main argument is that evidence is not something that is intrinsic to records; it is something that is negotiated after the fact by those who use the record. Therefore, the evidence that they find depends on what they are looking for. This actually makes perfect sense to me - in another class we're putting together a video about classification and subject headings, and as part of the video we interviewed an artist who looks through archival images of women and searches for evidence of suppressed lesbian identities (this is a really simplistic way of putting it; she articulates it much better). The point is that people can approach these images for different reasons and find different evidence in them based on what they are searching for.

So how does this apply to case studies? Well, I agree with a point that Mike made in his post where he says that it could be easier for researchers to find what they are looking for in a case study because their view is narrowed. On the other hand, couldn't we say this about any research method? Interviews and focus group sessions can similarly be conducted in a way that just proves what you're setting out to prove. The same thing goes for content analysis and surveys... When we're dealing with a field as amorphous as social science, it's really up to the researcher to ensure that they are not working with blinders on. And I guess it's up to the peers to critically review the work and poke holes in it, as unpleasant as it may be.

Scientific Or Not, Research Is Science


Both Yin's and Beaulieu's articles on case studies talk about the unstable reputation of this research method. They talk about the issue of scientists not taking case studies seriously. Yin cites Miles when presenting the idea that case studies are merely "intuitive, primitive, and unmanageable" and "cannot be expected to transcend storytelling". I think: so what?
Isn't all research just storytelling in some way? Don't we have to rely on the judgment of the practitioner and the idea that local circumstances may represent the whole? Focus groups, ethnography, participant observation - these can all be viewed as storytelling by a researcher. So what? After all, scientific progress relies on the telling of stories and the sharing of learning and knowledge. Would Albert Einstein have done what he did in science if he had been born just a few decades earlier and had exposure only to the information that was available then? What if it is the combination of researchers' opinions, judgments, and arguments, at the time they reach a prepared and brilliant mind like Einstein's, that leads to the development of new successful theories?
Today, medical professionals are expected to subscribe to multiple medical journals and to be read up on all the current research. Why do they do that? Not because every single research article completely solves a problem, is universal, and is a building block of theory. I think the need for research is more about the need for developing dialogue. Many case studies are particular to their local circumstances, but after seeing a bunch of them on the same topic, we can progress further in our thinking. Isn't that what science is?

Civil Science


I quite enjoyed Robert Yin's article on case studies. I found his analogy of the detective to be very instructive. This fits very nicely with my recurring theme of the obsession with making social science research appear "sciencey". I very much appreciated his point about the importance of the investigator's intuition, though I can see the sciencey people getting up in arms about that. However, I would like to extend the analogy even further. Within law there are two types of "truth". In criminal court the standard of proof (truth) is "beyond a reasonable doubt". This, if you will, is analogous to the standard of the hard sciences. Many in the social sciences seem to expect this level of proof from their class of research. I would rather see the civil standard of "on a balance of probabilities" applied as the expected standard. The fact that it demands a "lower" level of certainty does not necessarily make it less useful at getting at the truth (see O.J. Simpson). Given the complexity of many social science subjects and the need for intuitive input from the investigator, I think a balance of probabilities is the appropriate standard by which to judge most social science research.

Case studies


I'm not terribly familiar with case studies as a research strategy, so it was a bit of a struggle for me to wrap my head around what is being said in this week's readings (although this has been a phenomenon that I've encountered many, many times in the last year here). I liked Beaulieu et al.'s "middle range" reflection on it, where, if I'm understanding it, a case study "generalizes" in the way that it can provide a rich set of empirical data that can be used to either affirm or show gaps in an existing theory/conceptual framework. To me, at least viscerally, this seems reasonable: you aren't really going to get a "grand theory" out of it, but it can illustrate how something plays out in practice.

Nevertheless, the readings bring up a number of problems and I'm not sure I buy how they are addressed. Yin describes how researchers can easily become bogged down with too much data, and suggests narrowing one's focus to "meaningful" events. In a way, this seems slippery to me in that if you are too determined to look for what you are looking for, you risk finding it. I also find Yin's response to the problem of participant objection to the final report kind of fishy. I mean, if you were to show them aggregated data, it seems obvious that they would have a harder time disagreeing with it because they know it's not *just* their data. I'm not quite sure that justifies ignoring the problem.

Wednesday 7 November 2012

Peer Reviews: So What?

Having just submitted my peer review, and with it fresh in my mind, I'd like to offer some more thoughts on this. I'm probably not alone in asking "so what?" What's the point in actually doing a peer review? What kind of impact can I possibly have? To the researcher? What about to the rest of society? Well, I came across a newspaper article on the discourse on wind turbines and their effect on our health. One camp (the Ontario government) claims that wind turbines don't have any adverse impact on our health, and they cite a report by Dr. A. King, Ontario's Chief Medical Officer of Health. No information is given on what sources are used in this report, but I suppose we can assume it is not peer-reviewed. On the other hand, those opposed to wind farms contend that they do have an effect. To substantiate their claims they utilize a controlled, peer-reviewed scientific study published in the journal Noise & Health which, for the first time, links industrial wind turbines to serious health problems. I think sometimes as students, when we're doing critical reviews, we forget that these articles, especially when peer-reviewed, can be used to improve conditions in the world around us. I'm beginning to realize that policy makers and special interest groups do have an important duty to citizens to use peer-reviewed studies when deciding on a specific course of action, especially when it could impact our health and daily life. I really don't think researchers write their articles just for writing's sake, for selfish purposes, or for them to be created, stored, and forgotten in the world of cyberspace, never to be read again. Reading this reminded and reassured me of how significant peer reviewing is and that it can contribute to making our world a better place to live in. If used for ethical purposes, peer-reviewed articles are a kind of "weapon" against decision-making that is based on wishy-washy, questionable "studies" that ultimately only serve one interest group.

Monday 5 November 2012

Literature Reviews

I'll be honest: Luker kind of bugs me. While she apparently no longer agrees with the canonical social science method, she is still describing a very structured approach to research. She really seems to think that her way is the best way to do research, but, at the same time, the more I read of her text, the less her 'salsa-dancing' approach seems drastically different from what I have learned in previous methods classes.

In chapter 7 of her book, though, I found myself really liking her description of historical/comparative methods for interpreting the data collected. I think I liked this part of her text the most because I could relate to it based on previous research projects I have done, and this was always my favourite section of papers to write. Literature reviews, I think, are really important in a research project because they situate you within the literature of your subject. Luker captures this when she asks what would "convince someone who is not already predisposed to agree with you that your argument about what this is a case of is compelling" (p. 143).

One thing I think that Luker missed in her discussion is that, in my opinion, it is also important to find studies that your study disputes, in order to keep the academic discussion going. If you disagree with, or want to tweak, theories by other academics, and it's important to your study, you should do so. This, I think, is important for answering the question I quoted. Whether you are agreeing or disagreeing, it is important to explain why and how. I think that this could have been more explicitly explained by Luker in this chapter.

What's the point of research, anyway??

Knight's chapter this week frustrated me, as I am beginning to feel almost overwhelmed by all the ways one can conduct research and all of the ways these methods can produce inaccurate results. I appreciated the different research methodologies discussed in this chapter, specifically Knight's discussion of images. I, perhaps rather naively, had not considered the many implications and possibilities for research that are contained within an image. Reading this chapter forced me to reconsider previous methods of research, like interviews, that had once seemed so straightforward. Knight mentions that images in research can be problematic because image research is multi-layered, and must be made sense of by examining the context of the image's production and the intended audience (p. 104). When I first read this, I thought that image-based research may be more trouble than it's worth, because it is easy for two people to interpret a single image differently based on their own biases, even given the context of the image's production. Then I started thinking back to our class lecture and discussion about bias in interviews, and realized that interviews may not be much better! There is no way for a researcher to know if the interviewee's responses are being influenced by anything from the halo effect to even something as small as how the interviewer dressed! This, combined with Knight's discussion about the challenges of image research, left me slightly disillusioned and cynical about research in general. It is just so hard to prove someone's research is correct when it depends so much on interpretation: both how a subject interprets the study and how the researcher conducts the study.

I obviously realize that research is not just a big guessing game, and that it really is more technical and structured than I am giving it credit for here, but today I feel cynical! With all of these methods one has to wonder about all the inconclusive aspects of research. Maybe I'm just feeling overwhelmed due to the research proposal looming in the distance. Can anyone help me out of my research rut?

Data reuse


I quite enjoyed reading about post-empirical research in this week's Knight reading, and noticed that it overlaps with the ideas of data curation and sharing discussed in the data librarianship course that I am currently in. It seems that the idea of data reuse and sharing is still relatively new, and small-scale researchers have generally focused on producing and disseminating articles/papers or other textual accounts rather than on the data itself as a product of research. Unfortunately, this means that while small-scale researchers have collected huge amounts of data, it is largely unavailable for anything beyond the original research project, even though it could be useful for further conceptual analysis.

Of course, this raises quite a few issues. One from the reading that raises a red flag for me is the claim that statistically insignificant results could be combined to reach more powerful findings. While I like the idea, it seems like it would be tough to do this while avoiding disputes that the results are invalid, because even minor differences in methodology are thought to make data sets incomparable.
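For anyone curious what combining individually insignificant results can look like in practice, here is a minimal sketch of one classic approach, Fisher's method for combining p-values. This is not Knight's own example - the p-values below are invented, and the method assumes the studies are independent tests of the same hypothesis, which is exactly the kind of comparability those methodological disputes are about.

from math import log
from scipy.stats import chi2

# Hypothetical p-values from three small studies of the same effect; none is
# significant on its own at the 0.05 level.
p_values = [0.09, 0.11, 0.08]

# Fisher's method: -2 * sum(ln(p_i)) follows a chi-square distribution with
# 2k degrees of freedom when all k studies are testing a true null hypothesis.
statistic = -2 * sum(log(p) for p in p_values)
combined_p = chi2.sf(statistic, df=2 * len(p_values))

print(f"Fisher statistic: {statistic:.2f}, combined p-value: {combined_p:.4f}")

Here the combined p-value comes out around 0.03, i.e. significant even though no single study was - which is both the appeal of the idea and the reason the comparability of the underlying data sets matters so much.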

Sunday 4 November 2012

Quadrangle Squabbles

The main tension I picked up on during this week's readings (and the stuff I was reading for the peer review assignment) was various camps arguing over who gets to use the term "Content Analysis" for what types of research. Most people seem to see both qualitative and quantitative methods as having something to offer. The main controversy seems to be whether "content analysis" should be a big tent encompassing all sorts of textual analysis, or whether the qualitative people need to go start their own tent. On a related topic, based on Jesse's comment last week about Luker's position on Content Analysis, I found an endnote in her book (p. 281) that explained how Linguistics and Political Science still (surprisingly) use content analysis, but that it is pretty well dead in sociology. This seems to amount to varying research fashions between disciplines. As an ends-justify-the-means type of person, I really don't have much patience for these types of debates (unless they are entertaining). Personally, I find these controversies a distracting sideshow and I really don't care. What I want to know is whether the research they are publishing tells me something new, or makes me think about something in a different way.

Literacy With An Attitude - a must read!!!

There is one book from my undergrad studies in education that I keep referring to at least once a year with regard to some other, seemingly totally unrelated, issue or conversation. I've recommended it to all my friends and family, and those who read it thought it was fantastic. As I was reading the Van Dijk article, I knew that this was yet another one of those times when the book is so amazingly pertinent to the discussion. Perhaps some of you have read it already - I know we have some teachers in the class. It is called "Literacy with an Attitude" by Patrick J. Finn, and it's all about the "(re)production of power" through all sorts of discourse. In fact, it is so relevant to Van Dijk's discussion that I was sort of surprised there was no mention of it, or of Paulo Freire, the amazing Brazilian educator whose theories and actions are a major part of this book.

Essentially, Finn demonstrates, by referring to several studies, how education (meaning teachers' and parents' discourse) contributes to the continuation of class segregation. Lower-class kids are taught in a way that will train them for the kind of jobs they are supposed to have - blue-collar factory work, obeying directions, with very limited outlet for creativity or initiative. The same thing happens with middle-class, upper-middle-class, and upper-class kids - each group is taught with the aim of developing the skills they will need later on in life. Often this divide happens in the same classroom, with the same teacher! And the parents are usually (sometimes knowingly and sometimes unknowingly) enablers and supporters of this class perpetuation!

I do apologize for the multitude of exclamation marks, but when I first read this book I finally got a glimmer of understanding about this huge question that has always bothered me - why can't some kids just do better? After all, schools often invest tons of money into all sorts of programs and gadgets to help the underperforming kids. But without an underlying empowering discourse and without the kids themselves "buying in", in Finn's words - and this is where Van Dijk's "social cognition" comes in - there will be no real change.

Read the book. Really.

Peer Review Example

I should have thought of this earlier, but perhaps late is better than never. I was also struggling with the peer review project this week, mainly in terms of structure, but also with regard to how to word my comments. I didn't want them to sound too harsh, but I wanted to help the author publish a better paper. It's a very fine line, sometimes, between constructive criticism and just plain criticism, especially in a paper where I feel a lot needs to be reworked... So I googled "peer review example", and found this:

http://www2.etown.edu/docs/psychology/samplepeerreview.pdf

Though this one is in point form, whereas we're writing in paragraphs, and this one has to do with surveys, whereas mine deals with focus group research, I nevertheless found some of its ideas useful - I forgot to look for information regarding consent forms, and this example reminded me of it. I also found it useful with regard to tone - the reviewer is very clear as to what should be looked at, but some of the comments are phrased more like questions. It would seem to me that this would soften the blow for the author of the article, while at the same time clearly pointing out a problem to consider.


On Peer Reviewing

I have been more than a little preoccupied with understanding the use of interviews this term, so for the peer review assignment I've chosen the article on emergencies ("Bridging and Bonding in Emergency Management Networks"), which used interviews, in the hope that it will help me refine my understanding of how and why to use them as a research method. I really like what Luker wrote in her chapter on interviews: while interviews don't allow you to capture some sort of objective snapshot of "reality," they do give you insight into what is going on in people's heads, the ways in which they understand things, and how the world looks from their perspective (p. 167). It is this insight that I have taken with me into the peer-review process - as I try to evaluate the article on emergency management networks, I've been thinking a lot about how interviews helped (or didn't help?) the author to learn what he wanted to learn.

The peer review process has been challenging for me so far. Like others have mentioned, it's been difficult to be aware of when I am giving feedback in the hope of making the article better for the author, based on his own intentions, and when my feedback veers toward trying to make the article something I would have written myself. I really like the suggestion Michelle made a few posts below - I would have found it helpful to walk through a peer review (of an article different from the ones assigned) as a group. As it was, though, the workshop was helpful, and I left feeling a little more at ease about completing the assignment.

Discourse Analysis - Risky Business


I found it interesting that van Dijk, in his essay "Principles of Critical Discourse Analysis", discusses the relationship between discourse, power, dominance, social inequality, and the position of the discourse analyst. Van Dijk calls this study domain "multidisciplinary" and it's true - so many aspects of social structure are involved in defining dominance: economic, political, cultural, etc. I am surprised to hear that not a lot of research has been done on the role text or talk plays in creating the dominance of elites.
Second of all, van Dijk made me think about the Cuban American article that I am peer reviewing this week. The authors of the article had also set out to use discourse analysis as a method to identify the role the media play in the Cuban American population's views on Cuba. When reading the paper for the first time, it struck me that the question the researchers pose is extremely complex. Now I have the proof that it, in fact, is complex. The authors of my peer review paper, like van Dijk, need to think of many aspects of social structure - economic, political, cultural - if they want to answer a research question on the role media, or any discourse, plays in setting people's views. Van Dijk very carefully offers disclaimers, such as "the relationships involved and the conditions on reproduction are complicated", while the Castro article authors do not. In general, I think discourse analysis is a risky undertaking and has to be carried out with multiple considerations of meaning. Otherwise the data collected are just a raw, uninterpreted quantity.