In sociological research there are moments of chance and serendipity in which something happens that moves a project or one’s thinking into an unexpectedly fruitful direction. An event is witnessed, a phrase heard, a paper read, a person met. Suddenly an idea sparks into being or things are cast in a new light. Such moments of chance are in part a product of the messiness of research and of the way in which the world, so well-studied for so long, can still surprise us.
In the Morgan Centre we are quite fond of mess and disorderliness and we have been experimenting with different ways of knowing about the world that take more notice of its chaotic and surprising features. For the most part, however, sociological methods tend to emphasise orderliness. They pull things together, search out patterns, organise themes, categorise, classify and compare. You can see this in the design of qualitative and quantitative data analysis tools, for example, which often embed certain frameworks for coding, interrogating and representing data that presume a certain sense of structure and hierarchy. Our published findings also adhere to certain conventions, sometimes borrowed from the natural sciences, so that most journal articles are much the same, at least in terms of presentation of the argument and the data.
When Lynne Chapman, resident artist in the Morgan Centre, first began workshops with us she immediately set about trying to change our relationship to order, patterns and structure. She encouraged us to ‘let go’, take a chance and see what happened when we played with the paints, pens and pencils we had newly acquired. This was difficult for me, since I am not a natural artist and being bad at things is an uncomfortable feeling for most people. When I put pen to paper what I draw does not look like the thing I can see in front of me. Hence, my first forays with the freedom of the blank page produced rather uninspiring results.
But Lynne’s enthusiasm has been unfaltering and we have engaged in a range of different activities designed to make us comfortable with the fact that our representations do not look like the real thing. One example was the use of ‘wrong-hand portraits’ which forced us to abandon any hope of making a realistic representation of our subjects.
Eventually this started to have an effect on how I approached painting and sketching and I believe that I am starting to understand a bit more about how an artist like Lynne might observe the world and how they combine skill and serendipity in their engagements with it and representations of it. Sploshing paint about, drawing without looking, combining paint and pen and pencil have ‘freed up my hand’, as Lynne might put it.
The results are much improved. Of course, this is partly due to practice. But it is also due to letting go of certain constraints I had placed on myself as a novice. By learning how to make use of the limited skills that I am developing in combination with the chance afforded by the materials I am using, I have begun to feel unburdened by realism. I’m also trying to steal some of Lynne’s techniques of annotating sketches, using certain pens and pencils, and sketching quickly to try to capture some of the movement in everyday life.
Intellectually, this embrace of chance and serendipity is familiar and reminds me that an important feature of creative methods in sociology is that they are more adept at picking up some of the multi-layered nature of social reality than are standard survey techniques or semi-structured interviews. They too can capture some of the movement of everyday life, the way it doesn’t fit within boundaries, colours outside the lines, and yet holds shape, has some order and consistency.
Secrecy has long been a part of scientific and innovation practices. Being an ethnographer of laboratories, one occasionally comes up against a barrier of entry to a secret lab or space within a building, protected by intellectual property agreements, military or government contracts. Of course, military science is often conducted in secret, on nuclear, biological or chemical weapons, amongst other things. In his excellent book on ‘Secrecy and Science’, Brian Balmer describes how the Manhattan Project epitomised the way in which scientific secrecy operates at various levels of social organisation:
It was, in fact, an almost unprecedented organisation of not only scientists, but also industry and military. Moreover, a significant feature that accounts for the success of the Manhattan Project is the preoccupation with secrecy at the various sites involved in creating the atomic bomb. Compartmentalisation, telling people information on a strict need-to-know basis, meant only a few people had a complete overview of the project. […] In this manner, efficiency, security, bureaucracy and secrecy all came together at once. (Balmer, 2013: 8)
By their nature, it is often the most controversial, risky and ethically dubious research programmes that are conducted in secret, curtained off from society in order to protect knowledge and technology not only from public scrutiny but also from espionage or corporate theft. Thus when we find out that science has been conducted in secret we are generally right to be suspicious, and it should be no surprise that a meeting convened earlier this week, behind closed doors at Harvard, on the prospect of synthesising the human genome, has caused a stir.
The meeting was convened to discuss the prospects of coordinating a large collaborative venture to follow up on the Human Genome Project (HGP), one that would, over the next decade, seek to construct an entire human genome in a cell line. Currently unfunded but prospectively titled ‘HGP-Write: Testing Large Synthetic Genomes in Cells’, it is backed by some of the biggest names in the field.
As the New York Times reports, the meeting was invite-only and “The nearly 150 attendees were told not to contact the news media or to post on Twitter during the meeting.” In this regard, it would seem that the scientists hosting the meeting wanted the event to be part of what we could conceptualise – following the sociologist Georg Simmel’s well-known work on secrecy – as synthetic biology’s ‘second world’. As Simmel argued:
Secrecy secures, so to speak, the possibility of a second world alongside of the obvious world, and the latter is most strenuously affected by the former. Every relationship between two individuals or two groups will be characterized by the ratio of secrecy that is involved in it. Even when one of the parties does not notice the secret factor, yet the attitude of the concealer, and consequently the whole relationship, will be modified by it. (Simmel, 1906: 462)
A second world for synthetic biology is probably quite appealing to scientists working in the field, a space in which they could run wild with their ideas without worrying about what a supposedly fearful public might think. Synthetic biologists, for the most part, expect the public will be inappropriately scared of developments in the field. This has led to what Claire Marris (2015) calls ‘synbiophobia phobia’ – the fear that scientists have that the public will fear their work.
Synbiophobia phobia might be at the root of the decision to hold the meeting in private, as the organisers likely anticipated public fear at the potential of creating a human genome from scratch. But, as Simmel’s notion reminds us, no matter whether parties kept in the dark find out about the secrets being kept or not, the effect of secrecy is to change the attitude of the concealer and consequently the whole relationship between scientists and civil society.
Contrary to some scientists’ reactions to the media response to the closed meeting, secrecy in synthetic biology isn’t just a fiction created by newspapers and magazines to whip up a story. If the field does have at least the beginnings of a second world, divorced from public scrutiny, then it is almost certainly tied to the Defense Advanced Research Projects Agency (DARPA), which has had a keen interest in the field since its fledgling years and has invested tens of millions in synthetic biology under the remit of its Biological Technologies Office.
However, speaking to the NYT, George Church, one of the most prominent advocates of synthetic biology and co-organiser of the Harvard meeting, argued that the event had been misconstrued and that the secrecy was actually about protecting a paper currently under review that, if published, would make the ideas for the project publicly available and thus transparent. But as the invite read, “We intentionally did not invite the media, because we want everyone to speak freely and candidly without concerns about being misquoted or misinterpreted as the discussions evolve.” Whatever the motivation for the closed-door, invite-only meeting, the effect of concealment might well be the same: it implies that something suspicious is going on.
In this regard, the scientists have shot themselves in the foot. The meeting will worry people, even those who support synthetic biology in general. In fact, one of the most well-known advocates for synthetic biology, Drew Endy, refused to attend and co-authored an open letter criticising the closed meeting. It is only a matter of time until those more critical voices and outright enemies of synthetic biology seize on the secrecy of the meeting as further evidence of untoward ambitions for the field. It would be a mistake, though, to see this as unwarranted fear and ignorance. It has much more to do with the facts of synthetic biology and how it is being developed in relation to corporate interests. As Endy and Zoloth’s (2016: 2) letter argued:
The creation of new human life is one of the last human-associated processes that has not yet been industrialized or fully commodified. It remains an act of faith, joy, and hope. Discussions to synthesize, for the first time, a human genome should not occur in closed rooms.
Two of the common tenets of the emerging frameworks for responsible research and innovation, which have been closely tied to the development of synthetic biology, are the importance of scientific transparency and of deliberative governance processes. The UK Synthetic Biology Roadmap, for example, includes a commitment that the Synthetic Biology Leadership Council “should provide an exemplar of openness and transparency with two-way stakeholder engagement as a core principle.” (SBRCG, 2012: 32)
But transparency is easier to invoke than to implement. If scientists are going to take responsible research and innovation seriously, then actually implementing transparency and deliberation is going to be crucial, especially when the choices about such things are immediately within their control, as was the case this week. A second world for synthetic biology might be appealing in principle, but in practice it risks bringing about exactly the kinds of public fears that scientists and engineers worry about.
Balmer, B. (2013) Secrecy and science: A historical sociology of biological and chemical warfare. Surrey: Ashgate.
Marris, C. (2015) The construction of imaginaries of the public as a threat to synthetic biology. Science as Culture, 24(1), 83-98.
Simmel, G. (1906) The sociology of secrecy and of secret societies. The American Journal of Sociology, 11(4), 441-498.
Here are three questions that natural scientists and engineers often ask me, and which are commonly asked of social scientists participating in interdisciplinary collaborations.
Why do people not like X?
e.g. why do people not like synthetic food additives?
This question usually has to do with a perception that scientists, or a particular field of scientific work, is not viewed favourably by ‘the public’ or by industry, governments, NGOs, and so on. Natural scientists and engineers sometimes have the impression that social scientists will be able to explain why it is that their technical ambitions have been thwarted by a political misconception or misunderstanding.
The question is sometimes embedded in a range of other faulty assumptions about public understanding of science and science communication. As a question, it can position social scientists as experts in the irrational behaviour of individuals and groups. Social scientists can also be positioned through this style of questioning as brokers, and so be expected to help to open governance and/or public doors.
I’d rather talk about:
What do people actually say and do in relation to X? What practices are involved in regards to X and how do you envision X changing, supplementing or supplanting those practices?
When it comes to something like food additives, for example, I’d be interested to know how people make sense of ‘synthetic’ and ‘natural’ from within cooking, eating, feeding and caring practices. We could ask in what ways these concepts are important to the organisation of such practices, or in what ways these practices organise our demarcations of those concepts. It could be important to explore how novel methods of producing food additives sit alongside or displace existing methods of production, and with what global socioeconomic implications.
In this regard we could begin to understand why the notion or production of synthetic food additives might raise socio-political, economic or ethical questions from publics, NGOs, governments and so forth, rather than assuming people don’t like it because of ignorance.
What will people think of X?
e.g. what will people think of brain-based lie detection?
This type of question usually has to do with natural scientists’ worries about what ‘the public’ will think about their planned innovations or technical recommendations. It comes from a recognition that publics are interested and invested in scientific knowledge production. However, it is often motivated by a desire to ensure that people will think positively about X, and so is sometimes accompanied by a will to understand how to encourage people to feel positively about X.
The question is sometimes embedded in a range of other faulty assumptions about how people’s negative feelings about proposed technologies will inexorably lead to market failure and that this is necessarily a bad thing. It positions social scientists as PR gurus or pollsters, who can help to take the public temperature and design marketing strategies for innovations.
I’d rather talk about:
What kinds of imagined (necessary, likely, possible or peripheral) actions, routines, relations and social structures are being embedded in the sociotechnical work being conducted? In other words, putting the emphasis not so much on the object of technical interest but on how the envisaged object might impact upon existing practices of life, relations and social order, or open up new practices. Do we want these kinds of practices, and why or why not? What effects will these changes have, and with what implications for people’s experiences of social and technical phenomena?
In the given example of brain-based lie detection, I would want to know more about how we do lie detection now in different contexts and why we do it that way. I’d be interested to know how brain-based techniques would change these contexts if implemented and what implications such changes might have for our ways of living with each other. If we were talking about using brain scanning as evidence for use in legal proceedings, for example, we’d have to think carefully about why we currently use the investigation, interview, interrogation and jury systems. How would brain-based technologies fit in these practices and what would change? What kinds of life and forms of justice are implicated in such changes?
In this regard we could begin to unpick some of the tangle of concepts, practices, norms, politics and so on that are bound up with our current ways of doing lie detection and thus better understand what would be at stake for someone who is asked to give an opinion on a new lie detection technology.
Is it okay to do X?
e.g. is it okay to engineer life?
This question usually has to do with a perceived ethical ‘implication’ of some proposed technical innovation. The questions often centre on technical objects. They involve a recognition that sociotechnical innovation generally implies changes in how things are done or understood. They might have to do with abstract concepts like life or nature. However, by emphasising the objects of technical innovation or abstract questions these kinds of concerns largely miss the everyday practices that are at the heart of how ethical decisions and dispositions are made and formed.
This type of question is sometimes embedded in a range of assumptions about scientific objectivity and how ethical implications arise only from the implementation of knowledge and new technologies in the world rather than in the practices of knowledge production itself. In addition, such questions often come with the implication that X is going to happen anyway, but it would be good to know what moral status it is going to be given when it does happen.
This style of questioning is more comfortable for some social scientists than others, since some of us are experts in ethics. However, in the way the question is generally posed it positions social scientists as ethical arbiters, who themselves are being asked to judge the moral status of objects and so assess the social value of proposed innovations in order to help scientists justify actions they know they are going to take. This is a bit tricky and can be a far less comfortable space to inhabit.
I’d rather talk about:
What kinds of moral or ethical values are embedded in the scientific practices out of which the question has emerged? In other words, what has been decided about ethics already, what kinds of questions have been closed off and with what justification?
I’d also be looking to explore what kinds of ethics and norms are used in contexts in which the proposed X is being invoked. Are there differences in the ethical frameworks used to think about X across different spaces and times and in what ways do these differ? If there are differences of opinion about the ethical valence of X how do we decide amongst such opinions in governance, regulation, and technical innovation practices?
Understanding dementia and its entanglement with everyday life presents a conceptual and methodological challenge to a range of disciplines in the humanities, health and natural sciences. In this day of academic seminars, we explore some of the work being conducted in humanities and health research to examine this topic, focusing on the creative approaches that are being developed to tackle questions of selfhood, relationality, materiality and narrative.
The event is co-hosted by the Morgan Centre for the Study of Everyday Life, the Dementia and Ageing Research Team and MICRA, the Manchester Institute for Collaborative Research on Ageing.
Nearest train stations: Oxford Road and Manchester Piccadilly, both around 10-20 minutes’ walk from the Jean McFarlane Building.
Dr Andrea Capstick and Dr Katherine Ludwin, School of Dementia Studies, University of Bradford
Dr Christina Buse, Department of Sociology, University of York
Dr Lucy Burke, Department of English, Manchester Metropolitan University
Dr Jackie Kindell, Specialist Speech and Language Therapist, Older People’s Mental Health Service, Pennine Care NHS Foundation Trust.
Early Career Speakers
We have a few opportunities for early career researchers (including PhD students) to present their work and some limited funding to cover their travel expenses. If you are working on dementia and everyday life phenomena, particularly if you are using creative methods, then please consider putting forward a proposal to speak. We are open to creative suggestions for format, although ECR speakers should bear in mind that their slots will likely be limited to 15-20 minutes. To apply, contact the organisers with a suggested title and abstract of no more than 300 words by 21st March 2016. We will inform successful applicants by 23rd March.
This is a small event and is open to academic researchers working in the field of dementia and everyday life, or related areas.
My Morgan Centre colleague, Vanessa May, and I have received funding from the North West Doctoral Training Centre for a scholarship for a PhD student to study dementia and friendship, in partnership with Manchester Carers Forum. The information is below, please encourage any prospective students to apply.
The University of Manchester’s Morgan Centre for the Study of Everyday Lives is offering one fully-funded ESRC CASE PhD studentship in partnership with Manchester Carers Forum. The funding includes payment of tuition fees as well as a doctoral stipend at the UK Research Council’s required level of £14,057 per annum. The studentship is available to outstanding candidates wishing to commence their doctoral studies in September 2016.
Project description: In the past decade, friendship has become a concern in sociology as well as in anthropology and related disciplines. This PhD project will examine questions of friendship in the context of dementia, focusing on the experiences of people with dementia and their carers, as well as their friends. A crucial feature of this study will be to take seriously the relational perspective, understanding changes in the lives and friendships of carers and people with dementia as being fundamentally entangled. The project will develop a variety of elicitation and sensory methods (for example photo or music elicitation), as well as more conventional narrative and biographical methods, for the study of friendship and dementia. As an ESRC CASE studentship, this project will also involve close work with Manchester Carers Forum. As part of the funding requirements, the successful candidate will volunteer at Manchester Carers Forum as a member of the peer mentor coordination team, working directly with carers and peer mentors to support people living with dementia. The PhD student will work for 3 months of the year at the Carers Forum, broken down into a certain number of hours per week.
Studentship Details: The successful candidate will be supervised by Dr Andrew Balmer and Dr Vanessa May in the Department of Sociology. It is anticipated that the studentship will be for direct entry onto the three-year (+3) PhD programme in September; however, candidates for the 1+3 route (MSc Sociological Research followed by the three-year PhD programme) will be considered. Continuation of the award is subject to satisfactory performance.
Entry Requirements: Applicants must hold a UK Bachelors degree with First Class or Upper Second Class Honours in a relevant social science discipline, which should generally be Sociology or Anthropology, although other disciplines will be considered. Candidates must also have (or expect to gain before the start of the programme in September 2016) a UK Masters degree (or overseas equivalent) recognised as a research training masters by the ESRC. They should be qualified at a minimum of Merit level, with a coursework/examination average of 60% or more. Students without a Masters degree (intending to enter the 1+3 programme) will be considered but preference will be given to those with a Masters qualification. You must satisfy ESRC UK residential criteria to qualify for this studentship (information here: http://www.esrc.ac.uk/skills-and-careers/studentships/prospective-students/am-i-eligible-for-an-esrc-studentship)
Candidates meeting the following criteria will also be given preference: above 70% in their Bachelors or Masters; some demonstrable knowledge of the sociological literature on personal life and/or friendship; demonstrable expertise in qualitative research methods, particularly creative methods. Enquiries should be directed to firstname.lastname@example.org.
Deadline: 7th February 2016.
How to apply
Applicants should email Dr Andrew Balmer, Andrew.email@example.com with a full CV (including grade transcripts) and a covering letter explaining your interest in the project. Please note that applying for this PhD studentship funding is a separate process to applying for entry to the Manchester PhD Programme. The successful candidate will therefore also be required to fulfil the normal admissions procedures for the School of Social Sciences once they have been offered the NWDTC scholarship.
Some definitions of ‘affect’ hold it to be pre-linguistic, something fundamentally ‘non-representational’ (Massumi, 1995), like the sensation of anxiety conjured by a particular urban environment.
Affect in this guise is automatic. It is something the world conveys upon us as if by magic. The brain and wider nervous system are often crucial to such arguments, since they react to stimuli so rapidly it is easy to see how unconscious some of our responses can appear. In contrast, Margaret Wetherell (2012) understands affect to be entangled with all the rest of the mess of the world. Something that can happen in the blink of an eye, something embodied and habitual, yes, but also something that we designate to ourselves and others as part of situated everyday life. Affect is something that we reflect on, foster or discourage. It is structured and also specific.
In my own work I have been musing on affect and emotions as part of studying a very messy situation: what it is like to be caring for someone with dementia. Primarily I have been interested in how we deal with change in this context. No matter which kind of dementia a person is living with, there will be a lot of change involved, not least in their behaviour, but also in their relationships and in their capacities. One form of dementia is particularly pertinent to understanding affect and emotion. Behavioural variant frontotemporal dementia (bvFTD) involves a range of symptoms, but central to its manifestation are changes to a person’s affective disposition. People can become disinhibited and lack shame, empathy and insight. They might cry or laugh uncontrollably. Sometimes their tastes change, particularly as regards their appetite for sweet and sugary foods. They can also become obsessional, repeating routines and behaviours relentlessly.
For example, arranged around the living room of a carer I interviewed, Mike, there were several electronic drum kits. However, Mike told me as we walked in that he doesn’t play the drums. They were for his wife, Lucy, who was living with bvFTD. Lucy would repeatedly bash and drum on anything she could find. Mike had bought the drums because at least they made familiar sounds, had a volume control and weren’t easily demolished.
This led me to ponder whether Lucy drums things because of a change in her brain. Certainly there is a neurological problem causing a disruption in her everyday life. But why drums? This was a question Mike regularly posed to himself and to doctors. Some answers he received were that it might stop some unpleasant sensation that Lucy feels, or that she might derive some pleasure from the physical activity, from the sound or from the effect it has on others. Mike wonders if she’s angry, and says that she doesn’t show any empathy for him as he struggles to tolerate the endless cacophony. And Mike struggles to manage his own anger as he soldiers on. But why do the drums bother him, exactly? It seems that there’s certainly an element of automation here. The drums make an unpleasant noise, which makes him feel angry even against his will. But how do we judge a pleasant noise versus an unpleasant noise? There are physical factors. Some noises hurt our ears. But cultural ones too, having to do with the way in which rhythm and melody have been structured in the West. These physical and cultural factors also inform each other.
And surely Mike is also angry because of the sense of injustice he feels. That dementia has affected Lucy in this way and that she does the things that she does. And that he is losing her. He is still angry with her even though he knows this, which makes him angry with himself, and further angry with the disease and that he can’t do anything about it.
Clearly there are multiple forces shaping the manifestation of anger in Mike and Lucy’s relationship, each time situated, specific and multiple, but also part of a broader story of changing embodiment, capacity and everyday life, one that is at least partly shared with others living with bvFTD and their carers.
Encounters like these with changes in affect lead me to believe that we need to better understand its entanglement with the body and the brain, certainly, but that such an investigation has to be conducted from within the relational world in which these changes take place.
Massumi, B. (1995) ‘The Autonomy of Affect’, Cultural Critique, 31, 83-109.
Wetherell, M. (2012) Affect and Emotion: A New Social Science Understanding (London: Sage Publications).
iGEM is promoted to undergraduate students as an exciting and playful competition in which you get to create a cool new organism. The jamboree, for example, is organised as a way of performing this enthusiasm, through the way in which trophies and medals are awarded, but also through the parties, workshops and all the photo opportunities.
Lots of the concepts embedded in the idea of iGEM are borrowed from the world of software engineering and computer gadgetry. Take ‘biohacking’, for example. This set of concepts, ways of thinking, images and so on also relate to certain values of judgement and decision making. Trying to make something ‘cool’ or ‘exciting’ pushes thinking and design in some directions and not others. It influences the choices we make and how we evaluate our work.
Which is cooler? Designing an organism that uses bioluminescence to signal air quality, whilst also removing pollutants from the atmosphere; or making an organism that produces an enzyme involved in the industrial production of paint for ship hulls. They both sound reasonably helpful and there might be a commercial market for each one, but I think most people would say bioluminescence is a bit cooler than ship hulls. But why should science be designing things on the basis of them being cool, or fun, or exciting?
The way the iGEM competition tends to work is to prioritise and celebrate projects not only for their scientific success but for their fun spirit. I am not trying to make a case for or against the notion of fun and excitement as a relevant factor in the judging process or indeed in science more generally. I simply want to point out that there are dimensions to the choices that iGEM teams make and the decisions that judges make that are not simply objective. Instead, making choices of this kind involves a range of values and emotions that we tend not to see, and that we often erase from our descriptions of why we chose certain projects over others.
In this regard, I want to remind iGEM teams that the ways in which they choose their projects are laden with values and social features of everyday life that aren’t captured by the usual assumptions about how science and innovation progress.
Indeed, iGEM isn’t all about fun and playfulness. In fact, the fun and playfulness are part of a larger issue: iGEM is often more about demonstrating that synthetic biology is itself a useful field.
Being useful, being industrially-relevant, solving a problem and so on: these are some of the values that engineers often prize in their work and they have become central to synthetic biology. And perhaps these seem obvious and uncontested.
But in order for synthetic biology to be funded and to distinguish itself from previous forms of genetic engineering, it has had to organise itself directly in relation to making stuff that’s useful to industrial partners.
And hopefully here we can see that there are certain ethical implications. If we just focus on the ideas of being useful and industrially-relevant there are a number of questions we can pose. To whom will the work be useful? To what uses will they put it? Will it be used solely in the ways intended? Why should we make something that is industry relevant in the first place? And which industries do we prioritise? What are the values of those industries? What do they do in the world and with what implications? How do the values of these industries and companies align with your own values?
In this regard it becomes important to think about the specifics of the project you’re considering working on and whether the context of the industry in which you plan to work makes a difference to how the object you produce will be used. Is making something useful for the pharmaceutical industry just the same as making something useful for the agricultural industry, or for the weapons industry, or the space industry?
The common assumption that iGEM teams should make something that is useful is generally attached to making something industrially-relevant. But this embeds a certain relationship between science and industry into the process of choosing a project. Generally, it puts science in the service of industry.
That relationship is not necessary and doesn’t have to be part of how science and engineering work. However, it has become a background assumption in iGEM. It has been embedded into iGEM’s emphasis on demonstrating the relevance and usefulness of synthetic biology. And this plays out in teams’ choices.
So when you’re thinking about what to focus on for your project, consider why you want to make something for industrial use. What kinds of industry do you want to create things for? Think about how they might use it. Who will gain from this technology and who will lose?
It can be easy to get lost in making your iGEM choices, particularly if there are lots of options and you’re excited about a lot of different ideas. It is of course part of iGEM that you should have fun, but choosing your iGEM project has to be an ethical choice and it is one that you should make explicitly, that you should think about carefully, and that you should talk about with a range of different people before settling on an idea.