107 – Where Do We Draw the Line on Transparency?

The IDEMS Podcast
Description

In this episode, Lily and David Stern consider IDEMS’ relationship to transparency. They discuss the challenges and ethical considerations in sharing data, and the importance of balancing transparency with the need for privacy and protection against potential harm.

[00:00:00] Lily: Hello and welcome to the IDEMS podcast. I’m Lily Clements, a Data Scientist, and I’m here with David Stern, a founding director of IDEMS. Hi David.

[00:00:14] David: Hi Lily. Looking forward to another discussion. What are we discussing today?

[00:00:19] Lily: I thought today we could discuss where that line is in transparency.

[00:00:23] David: Oh, I’ve no idea. It’s a really good question. And it’s one which we struggle with quite a lot in a number of different ways. Do you have a way you’d like to approach this? Otherwise I have some concrete examples which I find really challenging.

[00:00:40] Lily: I’d love to hear the examples because I’m sure yours are much more expansive than mine. I guess for me, working on the data side, it comes from that very simple example of data. I can’t share data. I can’t share data with identifying information in it. It might be that I can’t share just any data with people, depending on the kind of contract and the agreements in place. But at IDEMS we very much, I think I want to say pride ourselves on being transparent.

[00:01:11] David: Value, I would argue. We value being transparent.

[00:01:14] Lily: I notice it’s not one of the principles, but it’s embedded in the principles in terms of being open by default.

[00:01:22] David: Yes. I think there is a principle of transparent accounting somewhere, but it’s not one of our organisational principles. There are more principles, there are staffing principles, there are principles all over the place. And one of the really difficult things with transparency is exactly as you’ve said: there’s good reason why you shouldn’t make everything transparent.

And with personally identifiable data, you could be putting people at risk of harm by sharing it, even if your intentions are good. There is a need for privacy, and this is really important. And that line between where the need for privacy should kick in, even if we value transparency, is really tricky.

One of the ones which was really tough right at the beginning was about salaries.

[00:02:27] Lily: I know that this is a discussion that’s also been had at team meetings. But yes, go ahead.

[00:02:32] David: How transparent should you be with people’s salaries? It’s a really difficult one because this is private and personal information. But if you’re not careful in the ways that you do it, then you can get situations where, as happens in many contexts, certain people are more aggressive in bargaining about their work and their salaries, and that leads to inequalities, and often these relate to biases, be they gender or other.

[00:03:12] Lily: Yeah.

[00:03:13] David: And so being transparent there is an important defence, I would argue, against some of those biases, that actually having an element of transparency is really important. And so we use a very transparent pay scale, which is the Oxford University pay scale, and we try not to make big deals out of either sharing things, being transparent, or keeping things private and being private about them.

I don’t have a good answer because I understand how privacy is important related to salaries and things like this, but I also understand the value of transparency. It’s a tricky one.

[00:04:04] Lily: Yeah, I suppose because in this case, it’s not about IDEMS. It’s about you being transparent about other people in a way.

[00:04:14] David: Me as an individual, I’m happy to be transparent for myself.

[00:04:18] Lily: Yes.

[00:04:18] David: But I’m also making decisions which aren’t just about me.

[00:04:22] Lily: Yeah.

[00:04:22] David: And so I don’t want to decide for others that level of transparency. So yeah, these are difficult questions, what you’re deciding for yourself or for others. I think this comes back to some of your questions about the data, that if the data relates to other people, then we shouldn’t be making decisions of transparency necessarily on their behalf, especially if there’s any risk that this could do harm. And so that responsibility of privacy would overrule any desire for transparency.

[00:05:03] Lily: Yes. I just want to add, the data very often relates to other people, but it’s anonymous. So if you can identify the other people, then that’s where the…

[00:05:13] David: But even that, so this is where open data is something which I generally believe in and think is good, but there are limits to it. And there are ways in which open data, even anonymised, can do harm to those whose data it is. So that responsibility for ensuring that the data cannot be used against the people who we’re trying to serve, that is a responsibility that I would argue is more important than our desire for transparency or openness in this context.

[00:05:54] Lily: And where? Oh.

[00:05:57] David: Do you want me to elaborate on that for a second?

[00:05:58] Lily: Yeah, I do.

[00:05:59] David: So let’s say, for example, that there’s a community that we’re working with where the information that we have, if it was made available to commercial partners, might mean that they get exploited in some way.

[00:06:15] Lily: I see.

[00:06:16] David: You might leave them vulnerable to exploitation in some way. And this is something where actually, it’s surprisingly common or easy. There’s all sorts of scenarios whereby these things are happening and data is being used in ways which are disadvantaging communities who are the source of that data. And so, being careful with data to ensure that we are confident that we’re not the source of harm is a heavy responsibility.

Now I’m quite relieved that very little data that we work with regularly are things where I would be really worried about that. If you’ve got health data, even anonymised, at scale, that changes, this is something which could have serious applications and uses in different ways and so on.

A lot of the data that we work with is related to specific projects with relatively narrow objectives and uses. So, at this point in time, we’re not often having to make difficult decisions on this. But these are decisions that we are thinking about. And I’m not saying that we will always get them right. In fact, I don’t think anyone can always get them right. But I am saying that it matters to be aware of the responsibilities and the value of transparency, but also of the problems it can lead to.

There’s no easy answer here, and I think that’s one of the most important lessons I’ve learned over time with transparency and even just openness. Which is, of course, a form of transparency that is very precisely defined, related to code and data and a number of other things.

And that’s one of the reasons for our open by default approach, where we don’t want to push things to be open, where there’s good reason for them not to be. Life is never simple. So it’s not as simple as saying open is good.

[00:08:46] Lily: Yeah.

[00:08:47] David: Open is a good default position. That’s something that I believe. But there are good reasons that certain things should not be open. And I think the transparency, I would argue, is relatively similar. That broadly our position is by default, we try to be transparent.

[00:09:08] Lily: It sounds like the line is that you’d rather have that kind of protection of others than the transparency. You’re transparent about why it’s transparent, and you’re transparent about why it’s not transparent.

[00:09:25] David: No, I don’t think even that’s necessarily true. Just a very simple element about privacy and why this is so important in some of the work that we do. In some of the app work, when you have work which relates to intimate partner violence, then privacy can be essential. And thinking about privacy even for things which seem harmless or don’t need to be private can be essential.

And so thinking through and understanding certain cases where transparency even of simple things can do harm is so important.

[00:10:14] Lily: But here you’re transparent about why you’re not allowing that data to be out there.

[00:10:25] David: Transparent to who? The whole point is that you don’t want a potentially violent partner to be knowing that there’s something that they don’t know.

[00:10:35] Lily: Okay.

[00:10:36] David: What do you mean, transparent? This isn’t about transparency. In that context, transparency is not important compared to priorities which matter more. And so I would argue that this is where the “by default” expression, which we use for open in our principles, open by default, is so important. You don’t need to be…

It’s also the same with the principles in general. You don’t have to follow the principles. If there’s good reasons not to, that’s fine. And there are good reasons why in certain cases, transparency is not what should be worried about or considered.

It’s one of the things that I find so difficult in general with the way as a society we’ve tended to move towards simple positions of good and bad rather than actually accepting and understanding forms of complexity. And transparency is one of these really clear cases where complexity is inherent. If you don’t have transparency, a lot of bad things can happen.

But if you do have total transparency, a lot of bad things can happen too. Understanding the complexity of when to be transparent, when not to be, what to share, how to share, there’s no easy answers here. I don’t have any particular insights other than it’s complex. And therefore it needs to be treated with thought and care.

[00:12:30] Lily: Yeah. And I guess, this conversation is really useful because it’s helping to see, well, there’s depth in everything, of course there is.

[00:12:40] David: Of course.

[00:12:41] Lily: Everything’s very well thought out, particularly with you and with IDEMS, you and Danny and Kate. But seeing that depth in transparency, I guess you take it for granted. Where I am, at the data level up here, I guess I’m more told, not told, but I’m more, yeah, I’m more told. Okay, we give this data to these people, but it has to be anonymous. Great. We do not give this data to these people. Okay. Whereas thought has gone into the hows and the whys and the, okay, should this be, should this not be?

[00:13:15] David: And this is why actually these sorts of episodes are so useful in general for us as a team internally, because this is something where being able to have these discussions and actually see that we don’t necessarily know when we make those decisions, but we do put thought in.

And, you know, there will be a point in your career, which is pretty soon, where you would probably switch from actually being, as you say, not putting the thought in, but accepting the decisions, to actually being the person who has to put the thought in to be able to make those decisions and to be able to not just… I’m never comfortable making some of these decisions, so it’s not necessarily about being comfortable making those decisions, but being thoughtful about them and recognising when thought adds value.

One of the things which I find very challenging, and which I hope will be central as we build the culture within IDEMS, is that this thoughtfulness should be central, and more central than rules. What often happens is that when people aren’t thoughtful, you get laws and you get rules, which then take the place of thoughtfulness. People have to be very thoughtful to build good rules and so on. But it means that to be compliant you’ve just got to tick the boxes.

[00:14:47] Lily: Yeah.

[00:14:48] David: And you don’t need to be thoughtful. I understand why that approach is the approach that our society is taking much more. And I’m not saying it’s a bad approach for society, but I am saying that within IDEMS the culture that we want to build is one of thoughtfulness, not one of sort of compliance.

So let me be really clear. I want to be, we need to be compliant, of course.

[00:15:16] Lily: Yeah.

[00:15:16] David: But we don’t want to just tick the box and be compliant and job done. We want to understand: have we gone far enough? Not just are we compliant, but why is this important? In what contexts might we need to take these ideas and apply them elsewhere, where there isn’t compliance, where there aren’t the same compliance issues?

And so actually that thoughtfulness is really critical to, I believe, the culture we need to create. And it’s hard because most of the time the conclusion is, I don’t really know. So you have to be comfortable with that uncertainty and that conclusion that we don’t really know, but it’s still worth being thoughtful about it.

[00:16:06] Lily: Comfortable with uncertainty is a way of putting it, which I guess, as mathematical scientists, isn’t something that comes naturally.

[00:16:16] David: No, it doesn’t. I love certainty. I wish I had some in my life right now.

[00:16:23] Lily: But then it’ll be interesting to speak to Kate, because I guess saying, as mathematical scientists, we’re not comfortable with uncertainty. Kate isn’t a mathematical scientist, how does she feel about uncertainties? Is it something that comes more naturally or is it..?

[00:16:38] David: Of course, Lucie as an anthropologist, has been trained to enjoy the messiness, as she likes to call it on occasion.

[00:16:48] Lily: Yeah.

[00:16:48] David: This is something where a lot of richness of life comes from that. And anthropologists are very aware of this and conscious. But I do think that although mathematical scientists are really trained to look for and to seek out certainty and really feel discomfort with uncertainty acutely, I don’t think there’s that many people who are actually comfortable with uncertainty.

I think I might have told you about this. It wasn’t a training, it was a talk I went to by the uncertainty experts, which really resonated with me. I was exposed to the fact that there are studies about this, and that the ability to deal with uncertainty, and the strategies to do so, are actually valuable. And there are elements of this which I have developed over the years for myself, which align with what the research is showing around uncertainty as well.

I guess the uncertainty issues are quite a long way from transparency, which is where we wanted to start.

[00:18:05] Lily: Yeah, sorry, it’s a conversation for another day, but really interesting to know that there’s a lot of thought in that area.

[00:18:13] David: Yeah, and I think to get back to transparency and relate it to this uncertainty issue: even if you’re well intentioned, because there’s so much uncertainty about how others could use the things you are sharing transparently, you have to think about how they could be used by someone without good intentions. That sort of thinking has to be something which you develop, which of course goes against how you want to act. But if you want to think about how to be transparent responsibly, you really have to be able to ask: if someone had access to this, how could it be abused?

[00:19:15] Lily: And that feeds into other principles, such as holistic interventions, in terms of thinking through that depth of how else this could be, well, used and abused.

[00:19:29] David: It’s interesting, I haven’t actually made that link to the holistic intervention sort of approach. Because I see transparency very much at the other end of the spectrum from interventions themselves.

[00:19:44] Lily: Okay.

[00:19:45] David: But it’s an interesting question. Maybe that’s something which I haven’t reflected on enough. Because I can see that, of course, if we think about interventions then there are elements where the transparency, if you’re thinking holistically, transparency somewhere has impacts elsewhere. So there probably is a relationship I just haven’t thought through yet. It’s an interesting one to reflect on. And maybe one for another episode at some point to actually dig into the holistic interventions a little bit more with respect to what is known and what is shared.

[00:20:21] Lily: Yeah. But for now the focus is, or was, transparency.

[00:20:29] David: So why did this interest you? Why this topic?

[00:20:31] Lily: It comes from I guess just thinking about IDEMS, I get the impression that, particularly before this conversation, of course I know that there’s depth. Now that we’ve had this conversation, I’m like, oh yeah, of course there’s complexity and depth, and there’s no blanket transparency.

But I guess, before this kind of conversation, it was more, not that I thought that there was blanket anything, because I know that there never is at IDEMS. But if you asked a question at IDEMS, I guess I assumed that you would give the full answer. And it got me thinking, well, that could be irresponsible, that could cause problems. So it’s about knowing where that line is between transparency and, I guess, privacy, or how did you put it, that kind of awareness of the danger it could cause. And then just openness. Does being open lead to vulnerabilities?

[00:21:30] David: It can do, absolutely.

[00:21:32] Lily: But openness can also lead to strength because of, you know, open source and then more people can get involved and whatnot. Of course the conversation is that there is complexity and that there’s no one answer.

[00:21:45] David: But I think you have brought up an element of it which I haven’t touched on, which is that a lot of the reasons against transparency are the fear that it will negatively affect you.

[00:22:06] Lily: Yeah.

[00:22:07] David: And that is one where, I would argue, this is really important. I’ve been speaking almost against transparency because it might negatively affect others. And that’s something you need to be very careful of. But I would argue that part of where this has come from, for you, is the fact that we don’t try to protect ourselves as much as we should, maybe, or could.

We prefer to be transparent and this has negative consequences for IDEMS in different ways, as well as building the strength. And I think part of this is sometimes, being transparent and putting yourself out there in ways which makes you vulnerable. This can have negative consequences, but it can also build strength. And there, as IDEMS, we try to be brave and use it to build strength, rather than protect ourselves in that way.

And I think part of that comes back to the confidence we have that we should have enough to succeed. We should have enough skills within our team, we should have enough opportunities and so on to succeed, that we don’t need to maximise. Quite often, if you’re putting yourself out there and you’re making things available that others can use, then you probably can’t maximise.

Even a very simple one of this is that in the way that we work with partners, we’re very transparent about our costs, our internal workings, compared to other organizations, which may be less transparent in that way. And we do pay a price for that in certain ways, but we also have benefits because it builds the trust between us and our partners. We very rarely would put barriers between us and partners.

Very often for organisations, what’s known within the organisation and what’s known beyond it can be very different. Whereas for us, we try to keep that barrier very low, so that the difference between what our partners see and what’s inside is very small. That’s part of, I think, what you were saying about the transparency approach. And yes, we do pay a price for that. But we also believe we get deeper partnerships and longer-term collaborations.

[00:25:01] Lily: And the people that we get those partnerships with are going to be people that have similar values?

[00:25:08] David: But that’s not always what we necessarily want. And there is also the acceptance that not everyone will have similar values, and some people will use this against us, and we accept that. And being able to be transparent, even in that knowledge, and accept that you’re doing that, is, I believe, part of what’s needed.

Now, it only works if we succeed. This is the thing. And that’s where those lines, they’re fine lines, they’re difficult. A very simple one is that with our staff and with our partners, there have been times where, you know, I believe some of our partners would have loved to poach some of our staff, to have them working full time for them, and they could have paid them more than we could internally. And they know that, and they know that because we’re transparent about it.

But that’s not yet happened. If that happens and a partner offers something to a staff member where they would prefer that offer, then I would support it and this is part of building the ecosystem of different organizations working together towards the same goals.

And I think what’s been interesting is that there are other benefits that the people within IDEMS value, which means that even if it was doing the same work within the other organisation, there’s reasons why they would rather do it within IDEMS because of the mix of different things, the way that we approach it, the nature of the organization.

It’s only because there’s that added value that we’re able to have that transparency, which would enable people to try and make that offer. I would never stop someone. I’d say, yeah, okay, if you want to make the offer to them, make the offer. Because if you’d like to employ them and you could employ them directly, and if that’s better for them and it’s better for you, then we would want that to happen.

As an organisation, as well as individuals, we believe we should be adding value. Now this is a really hard process. And as I say, it’s not something that I think will always work. As we get bigger, I’m sure there will be a heavier price to pay for it in different ways, but it also builds the strength.

And so the question is, where’s that balance? Part of the reason that we choose to be maybe more transparent than most is that we’re pretty confident in what we’re doing and who we are. And so, because of the depth, as you say, it keeps coming back to the depth. It’s the depth of the complications of transparency, but also the depth of the value of being able to be transparent. And we’re not vulnerable, because we do have the depth.

[00:28:16] Lily: The depth builds that kind of strength.

[00:28:18] David: Yeah. And if people dig, they’re unlikely to find something that they don’t like. That’s where we can be transparent. But more than that, that transparency then holds us to account. And so it enforces that we are able to live up to what we’re trying to do.

[00:28:35] Lily: That’s really interesting.

[00:28:37] David: Transparency is a hard one. It is something which is very important to us. It’s something where both myself and Danny, when we started IDEMS, we had experiences which made us feel very strongly that this was important and that even if there was a price to pay, it would be a price worth paying.

So to come back to where this has come from, yes, as an organization, we do try to be more transparent about more things than would maybe be usual.

[00:29:13] Lily: Thank you very much, David. That’s been very insightful as always and very interesting.

[00:29:17] David: Thank you.