Create. Share. Engage.
Portfolios for learning and more brought to you by the Mahara team at Catalyst IT. Host Kristina Hoeppner talks with portfolio practitioners, researchers, learning designers, students, and others about their portfolio story.
Create. Share. Engage.
Brian Williams: CIA and portfolios
Brian Williams, BTech, Security Operations Supervisor at Catalyst IT, shares why information security is an important consideration for people creating and working with portfolios. He explains the acronym CIA (confidentiality, integrity, availability) and highlights a few stories that illustrate why data privacy needs to be a fundamental consideration for portfolio creators.
Connect with Brian on LinkedIn
Resources
- Mahara
- AAEEBL Digital Ethics in ePortfolios Principles
- GDPR
- What is a SOC?
- The CIA triad: Definition, components, and examples
Subscribe to the monthly newsletter about Mahara and portfolios.
Production information
Production: Catalyst IT
Host: Kristina Hoeppner
Artwork: Evonne Cheung
Music: The Mahara tune by Josh Woodward
Kristina Hoeppner 00:05
Welcome to 'Create. Share. Engage.' This is the podcast about portfolios for learning and more for educators, learning designers, and managers keen on integrating portfolios with their education and professional development practices. 'Create. Share. Engage.' is brought to you by the Mahara team at Catalyst IT. My name is Kristina Hoeppner.
Kristina Hoeppner 00:28
Today I'm speaking with Brian Williams, our Security Operations Centre Supervisor here at Catalyst IT. Developing a software product, the Mahara team works very closely with our information security team to ensure that we follow good practice because electronic portfolios live primarily on the internet these days. Whether they are publicly accessible showcase portfolios or privately held learning portfolios to which only the respective portfolio authors have access, they are online. Therefore, those who create portfolios and support learners and staff in keeping their learning evidence and reflections need to be able to store their content safely. That's where information security comes in. Today's episode will focus on that important part of using portfolio technology. So thank you, Brian, for having a chat with me today.
Brian Williams 01:20
Hey Kristina, thanks for having me.
Kristina Hoeppner 01:21
Brian, you are our SOC Supervisor. So not the sock with the 'k' at the end, but just the Security Operations Centre - SOC. How did you actually get interested in that field?
Brian Williams 01:33
So it's not the most interesting story, but back in the day in the 90s, I was around about 10 years old and watching 'The Matrix', and Neo, who I looked up to at the time, was a hacker. I thought, "Brilliant. I'm going to do what Neo does and become some sort of a hacker or get into security and that sort of stuff." And yeah, it just went from there. After a while I did start to find out that information security and SOC operations weren't quite like the movie 'The Matrix', but by that time, I was pretty into it.
Brian Williams 02:02
So I went into IT, and then eventually I joined the military. They were actually standing up a Security Operations Centre, so I had the opportunity to help build that capability out in the New Zealand Army. And yeah, it pretty much went from there, and now I'm currently working for Catalyst in their Security Operations Centre.
Kristina Hoeppner 02:22
Great that watching a movie did not lead to a rude awakening when you then made your career choice, and that even after so many years, you're still interested in the topic and help us be safe in all of our operations.
Brian Williams 02:35
I think it just goes to show how good Keanu Reeves' performance was in that movie [Kristina laughs].
Kristina Hoeppner 02:41
So if you were to describe the goal of information security briefly to our audience who may or may not have watched 'The Matrix', what would you say?
Brian Williams 02:50
The goal of information security is actually one of the things which is pretty well defined. Just keep in mind that whether you call it information security or data integrity or defensive computer operations or whatever, it's a new industry. It's only been around for a couple of decades, at least in the civilian sector. Obviously, in military contexts, it's been around for some time. So there are a lot of, you know, terms, and we're still finding our feet when it comes to the way that we define things, and it's an ever-evolving field.
Brian Williams 03:22
But the thing it really boils down to is what we call the CIA. That is an acronym for confidentiality, integrity, and availability. So CIA, obviously not the CIA in America that you might often see in the movies. Yeah, so confidentiality means that, 'Hey, you've got your information, and only the people who ought to be able to see it should be able to see it.' If I've got files on my computer, I want to make sure that you can't just hop onto my computer and start looking at those. That's confidentiality, and it's probably what relates to portfolios more than either of the other two.
Brian Williams 03:56
But I'll also just go into the other two, and the next one is integrity. You want to make sure that the information that you're looking at is true and can be trusted. That just means that, hey, if I create a CV in an ePortfolio, someone can't come along and make changes to that. It sort of touches on the confidentiality aspect, but yeah, it's making sure that the data we're actually looking at is, you know, legitimate and hasn't been manipulated or changed in any way. You know, if you've got on your CV that, 'Hey, you worked really hard and got yourself a degree or something like that,' you don't want someone coming along and changing that.
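For technically minded listeners, one common way to make the integrity idea concrete is to record a cryptographic fingerprint of a file and compare it later. Here is a minimal Python sketch, purely illustrative: the file name and workflow are hypothetical, and this is not how any particular portfolio platform implements integrity checking.

import hashlib

def fingerprint(path: str) -> str:
    """Return the SHA-256 digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files don't need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Record the digest when the CV is saved (hypothetical file name)...
original = fingerprint("my_cv.pdf")
# ...and recompute it later. If the digests differ, the file was altered.
if fingerprint("my_cv.pdf") != original:
    print("Warning: the file has changed since it was fingerprinted.")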
Brian Williams 04:29
And then there's availability. It's all well and good to make sure your data is correct and confidential, but it needs to be available for when we need it. If we have a piece of paper with all the information that we need, and we lock it behind three layers of vaults and bury it underground, well, yep, sure, it's pretty confidential. No one's going to be able to find it. The integrity is there; it's not going to be changed. But is it available for use? I think that's really important when it comes to the ePortfolio side because you've put a lot of time and love into creating these portfolios, and if they're not accessible or available, either for yourself or for whoever you want to be able to see them, then you're missing one of the fundamentals of portfolios, being able to share them and show them off. Yeah, so they need to be available. So those are the three core aspects, and they relate not just to portfolios, but to pretty much all of information security, whether it be, say, banking or governments, any time that you've got information which you want to be secure.
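In code, the confidentiality leg usually comes down to an explicit access check before anything is shown. A minimal sketch under a hypothetical page model - this is not Mahara's actual access control, which is considerably richer:

from dataclasses import dataclass, field

@dataclass
class PortfolioPage:
    owner: str
    shared_with: set = field(default_factory=set)  # users invited to view
    public: bool = False                           # opt-in, never the default

    def can_view(self, user: str) -> bool:
        # Only the owner and explicitly invited viewers see a private page.
        return self.public or user == self.owner or user in self.shared_with

page = PortfolioPage(owner="alice")
page.shared_with.add("bob")
assert page.can_view("alice")        # the author always has access
assert page.can_view("bob")          # an invited viewer has access
assert not page.can_view("mallory")  # everyone else is denied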
Kristina Hoeppner 05:29
That's a neat acronym there - CIA - confidentiality, integrity, and availability, and it really nicely aligns with the AAEEBL Digital Ethics Task Force Principles that we have set up for digital ethics. In particular, it relates to the data responsibility principle because there we talk about advocating for secure location and storage of the data, which goes to the integrity and confidentiality of your data; adequate privacy policies, which is confidentiality; availability, so that learners can decide with whom they want to share their portfolio content and not just have it automatically publicly available to everybody; and also adhering to standards for that data collection.
Kristina Hoeppner 06:15
Brian, you did tell us a few of the positive reasons why you should be following CIA. Now what could happen if you didn't follow any of those things? What could be some of the detriments of not knowing where your data goes or what privacy policies apply where your data is stored? Do you have a couple of examples for that?
Brian Williams 06:39
Yeah, I guess I can think of a couple. What that comes down to is your data, and I think that for a long time, we haven't actually valued our data as individuals. Companies have valued it incredibly. You look at all the big tech companies: what do they actually deal with? A lot of them don't actually really make products as such; they deal in data, and harvesting that data is obviously a massive part of it. We're just starting to move into a really, really interesting space. So I don't want to go down the AI rabbit hole of what the implications of AI are going to be, but AI needs data. That is the one thing it absolutely needs. An AI without data is like an engine without any fuel. It needs it to really, really work.
Brian Williams 07:21
And the thing is, we put a lot of information out on the internet. Some of it we're absolutely fine with; we're aware that it's going to be shared. We hop onto our social media, and we share information. What's concerning is when things are shared without our prior knowledge or understanding that this is actually going to go out there and be shared. A lot of people feel like, 'Oh, you know, hey, everything is spying on us, and the horse has already bolted.' But there's a lot that we can do when we put something on the internet or create something. You know, hey, if I write something down on a piece of paper, I know that it's probably not going to end up on the internet. But the thing is, we're living in an age where there are so many amazing tools out there, so many great pieces of technology, which really, really help us; they make our lives a lot easier. But we need to be clear about some of the risks which come with those and also the ways to actually mitigate those risks.
Brian Williams 08:16
So some of the things that we can be aware of: what are they doing with that information? Do they have any kind of security policy that says, 'Hey, look, we host your information; we're not going to pass it on to third-party providers'? Have they got any kind of code of ethics or anything like that? Because, yeah, sure, someone might say, 'Oh, I don't really care if my work ends up getting leaked.' And I can understand that certain pieces of information we will actually actively publish, CVs and whatnot, but for portfolios and also just digital work in general, there needs to be a presumption of privacy.
Brian Williams 08:50
We can't just give up privacy and go, 'Oh, well, we no longer live in a world where privacy is a thing.' That's just, to me, unacceptable because I feel like it stifles our creativity. If I put a microphone up to your face everywhere you go and say, "I'm going to listen to everything," that will stifle what you say, and I feel like that comes back to anything that we put on our computer as well. Sometimes you don't want your work to be seen or to be published. And if you feel that you can trust your platform or trust what you're using, then you're going to be a lot more willing to be, I guess, creative in different ways, as opposed to worrying about what people might say or, you know, seeing it on the internet.
Kristina Hoeppner 09:31
The argument that I do hear often in that context is 'I don't have anything to hide, so it is okay for me.' However, that argument doesn't take into account that that is not the case for everybody, that there are vulnerable communities or vulnerable individuals who do need to keep their data private or want to keep their data private and not have everything really available because sometimes it is scary what certain programs know about you simply because you are being tracked around the internet.
Brian Williams 09:59
I completely agree. You shouldn't have to justify privacy. You shouldn't have to turn around and say, 'Oh, look, you know, I want to be private because of these reasons.' It's just like, no, no, no, that's not how it works. Privacy should be implicit. You get to choose, or you ought to be able to choose, what you share. I absolutely acknowledge that some people will say, 'I've got nothing to hide.' Wonderful, then share everything. But it's not about hiding things. It's about what you want to be shared. And that, I think, is a really, really important principle for the future, which people need to hold on to because it's not necessarily that, you know, once the cat's out of the bag, it can never go back. We're finally seeing governments and legislation starting to catch up to these, you know, big tech companies, and I'm not here to bash big tech or anything like that in any way. You know, we all use big tech, but they didn't become the monoliths that they are by providing you with a free service.
Kristina Hoeppner 10:51
You did give up a lot of your privacy because they are just harvesting the data.
Brian Williams 10:55
Yeah, exactly. You know, people need to be aware of what they're giving up, and if they don't care at the moment, hey, no problem. But, you know, future them may care. There are always steps that you can take to be a little bit more cautious on the internet.
Brian Williams 11:09
One thing I do actually want to just touch on, from the platform side and also the groups or institutions which provide these kinds of platforms, is that I feel like regulation is actually starting to come, or at least some sort of certification or accreditation is on its way. I'm not specifically talking about ePortfolio platforms here, sorry, but the number of times over the last, say, five to ten years that we've heard 'Oh, XYZ company has lost all your data in some kind of a breach,' and everyone goes, 'Ah, well, that's annoying.' And the company generally goes, 'Yeah, sorry about that. Moving on.'
Brian Williams 11:43
Well, the thing is, if you went to a bank and deposited your money, and the bank got robbed, and they said, 'Yes, sorry about that. We got robbed. All your money's gone,' you'd be like, 'That's not acceptable. I entrusted you with that money.' And quite rightly, when we go to a bank and it gets robbed, we say, 'Oh, okay, well, you know, that sucks for you, but my money is still mine. I deposited it with you; I expect to still be able to get it out.' With data, because it's our information, it's personalised to us, when that information is gone, it's not like it can be returned. A company can't go, 'Oh, yep, sorry, we lost that; however, we've managed to get it back.' Once it's out there, it's gone.
Brian Williams 12:18
Because over the last couple of decades, we've just seen a real lack of consideration by companies to do this because it costs money. You know, securing data costs money, and data is an attractive asset; people want to steal it. Companies have not been particularly good, and governments are finally starting to catch on. For example, we had the GDPR come in over in the EU, which, you know, puts a lot more emphasis on the holders of our information to make sure that they're collecting only what they need to collect. Because the whole 'Just trust me, bro' thing doesn't work any more. And we've seen it time and time and time again that it just doesn't work. Companies are going to try and make money. That's fair enough, but as individuals, as citizens, as governments, we have a right to feel safe when we provide that information.
Kristina Hoeppner 13:04
To make the general information that you've provided a bit more concrete, Brian: a few years ago, all the URLs had 'http' at the front, so 'http://' and then whatever the domain was. But increasingly, we have seen that an 's' was added at the end. What does that actually mean for us people using those websites? Why is that 's' in https so important?
Brian Williams 13:31
So the 's' is important because it ensures that your information is going to be encrypted. If I put in my username and password when it was just HTTP, there's a very, very simple kind of man-in-the-middle attack, I guess you could say.
Kristina Hoeppner 13:43
So somebody essentially can read your passwords. It's like sending your password via a postcard.
Brian Williams 13:48
What a brilliant analogy. I'll use that one. Someone sends you a postcard, and any set of hands it goes through, anyone can just read it along the way. Whilst ISPs can generally be trusted, a lot of the time your information is routed through countries or companies or places that we have no control over. So obviously, we're here in New Zealand. If our information is stored in New Zealand, you know, we have a little bit more sovereignty over it. We've got data sovereignty. However, if it's in the States, it can be routed through all sorts of different locations, and with some of those countries - hey, international relations - we may not have the best kind of relations. There are also some not very nice people on the internet sometimes. And so, yeah, the 's' allows that information to be encrypted. And then when it is received at the other end, it is decrypted so they can validate your username and password.
Brian Williams 14:38
And that's something which is fundamental. In this day and age, if you're using a website which does not have https, you know, or the little lock and everything on the side, then you really need to be thinking, okay, firstly, you shouldn't be posting anything to it. You shouldn't be typing anything into it. Yeah, a lot of older websites might not have that, and they're probably okay to click around on, but you do really want to consider that anything which doesn't have that 's' can be read by third parties. And that's something important to keep in mind.
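For the curious, you can watch this machinery at work with a few lines of Python's standard library: open a TLS connection much as a browser does and inspect the certificate it verifies. A small sketch; the hostname is just an example:

import socket
import ssl

hostname = "mahara.org"  # any https site; used here purely as an example

# create_default_context() verifies the server's certificate against the
# system's trusted certificate authorities, much as a browser would.
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("Negotiated protocol:", tls.version())     # e.g. TLSv1.3
        print("Certificate valid until:", cert["notAfter"])
        # Anything sent over `tls` now travels encrypted; over plain
        # http it would be readable by anyone along the route.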
Kristina Hoeppner 15:08
The nice thing is that we who are using these websites don't have to do anything. We don't need to install special security software; all of that happens in the background.
Brian Williams 15:19
Yeah, that's correct. That kind of goes into something I'd like to touch on: it's about trust. At the end of the day, the everyday users of the internet or ePortfolios shouldn't have to go get some sort of an IT degree or, you know, do training in all these sorts of things. They need to be able to trust their platforms, just like every time I hop in a car, I don't need to have a mechanical engineering degree. I just need to know that it was manufactured by people who have a good standing. If I find out that my mate down the road built it, and I also know that he's not a particularly good engineer, I may not trust it. So it is really important to keep in mind that with the platforms that we use, there's always going to be an element of trust, but we get to choose who we trust, or we should at least have a think about who we trust.
Brian Williams 16:03
So if you're a student using ePortfolios, then inherently you have to trust that the person who provided you with that system has done their due diligence. Because it's unlikely that they actually created the platform themselves, there's then an element of needing to trust the platform provider, you know, whoever actually sat down and built it. Trust doesn't have to be faith. It doesn't need to be blind trust or anything like that. People need to do their due diligence, especially those organisations who provide these platforms for their students. They need to check really, really important things like, well, how long have they been around for? What is their reputation? Does anyone else actually use these products? All the way through to, okay, well, how often do they patch? You know, we might have heard about some important security vulnerability. How quickly did they get in touch with us to say, 'Hey, there is a critical update which needs to be put through'? Are they in touch with us at all? Or did we just download it and use it? When was the last time it was updated? If it was updated 10 years ago, then that piece of software might have been abandoned and is no longer, you know, fit for purpose.
Kristina Hoeppner 17:08
It has quite a bit to do with transparency, so that, say, a university that makes a portfolio platform available to all staff and students can actually provide that information if somebody wanted to dig deeper, and so that students or staff can see, yes, our organisation has done its due diligence. It's safe for us to use.
Brian Williams 17:30
Yeah, and you know, I mean, open source is an amazing space for that sort of thing. With open source, you can actually sit down and interrogate the code. Not every institution is going to have the ability to go through it, but the fact that some definitely will, and that it is constantly being interrogated, means that any faults can be found because, at the end of the day, the people who write programs are humans, and humans make mistakes. It's often about how quickly those mistakes can be identified and also remedied. The more eyes that you've got on a product, then generally the quicker these things can actually be rectified. And exactly like you say, transparency. The 'Just trust me, bro' mindset towards these sorts of things, like 'Oh, I trust that this app isn't listening in on me' - you do have to trust some people eventually, but it's a lot easier to trust someone when you've got true transparency in place.
Kristina Hoeppner 18:21
I think it also means that if you do come across an issue that you think might be a problem, you report it, to make sure that the organisation that provided you with the platform, be that the university or a provider you've directly signed up with, can look into that problem and determine whether it's actually a problem or not. And that's where people like you and the rest of our security team come in, who can help with those things.
Brian Williams 18:51
Yeah, exactly. And that's something that, you know, obviously our team assists with in our company. So if someone turns around and says, 'Hey, I think I've found something' which, you know, might be a mistake, we can then investigate that and get back to them. That two-way communication is incredibly important, I believe. There's nothing worse than finding some sort of vulnerability and then feeling like you're just screaming into the void.
Kristina Hoeppner 19:12
This was quite a scary episode so far, I must say [laughs], with all the possible potential problems. But what I do also like is that you pointed us to some strategies that can help students, educators, and also organisations, so that you do not need to get an information security degree or constantly be in touch with the security team at your own organisation. Oftentimes, I think, it's really also about employing some common sense: not trusting too easily, taking a breath, and looking at what you're actually using, but also knowing some of those basics, in particular around encryption, the security protocol, the https.
Brian Williams 19:53
Actually, I just really want to touch on the security part. Sorry, when I say the security part, I mean it can seem like, you know, doom and gloom, the internet is out to get you. But I really want to touch on the fact that as humans, we have a tendency to see things as a zero-sum game or, you know, to throw out the baby with the bathwater. Like, I remember when internet banking was first starting. It was quite strange because up to that sort of time, the internet was always seen as a little bit of a dodgy space. You know, you were always told, 'Don't put your credit card details in anywhere. Be careful what you post. You don't know who you're going to be talking to on the internet. Don't trust what you see on the internet.' So when the banks started providing internet banking, it required a little bit of a mind shift that all of a sudden, you know, you could access all of your money via the internet, as opposed to walking into a bank with all the security and all the vaults and those sorts of things.
Brian Williams 20:43
Now, the thing is, round about that time, a lot of people, definitely a little bit older than me, would turn around and go, 'No, no, I don't do internet banking.' And you know, that was a blanket sort of thing: 'No, I don't like the internet, I don't trust it, and therefore I don't trust internet banking.' That's well and good. That's their decision to make. However, things move on in terms of how we work as a society. A lot of people hopped onto internet banking; therefore, a lot of branches started to shut down. And what that meant was that for the people who were like, 'Oh, no, no, I don't do internet banking,' life became harder and harder in order to do what was relatively normal. If the only way that you can pay your bills is to go down to the bank and get out the money, and those branches have now closed down, that's difficult.
Brian Williams 21:28
And the point I guess I'm trying to make is that you don't want to be left behind. We can't stop using technology in general because we're worried about some of the privacy implications. What I feel like people need to do is not just get overwhelmed with the scare factor or anything like that. Just go, okay, look, we can't give up that ground. What we need to do is be aware of the risks when we're using new technology. And once we understand the risks, we can either mitigate them or accept them on our own terms. That's what we're talking about with trust. There's always going to be risks. We do have to trust some things. But understanding the risks and demanding more of those providers is really, really key.
Kristina Hoeppner 22:09
I think the key in what you've just said is don't just trust the provider, but also engage with them in order to make things better. Query them, talk to them, because in a way, I find that if nobody uses the software, of course nobody's going to put any money in to make it better, to make changes. Whereas if more and more people are using it and pose those questions - 'What about data security? What about data sovereignty? Can I keep things private?' - then suddenly, when organisations see there is actually a demand for improving things, making things more secure, it will happen.
Brian Williams 22:47
I completely agree. Demand those features from the products and hold platforms to account. Demand the tooling, demand being able to have the ability to customise, say, who you share the information with. All of these things are very, very important. And yeah, have a good relationship with the people that you buy the software off.
Kristina Hoeppner 23:06
I feel also, especially in our open source space, we should really work together as a community and see how we can make things work because, of course, we are working in so many jurisdictions that there's hardly ever a one-size-fits-all approach. But we do need to know what is important for people so that we can make those incremental changes to support them wherever they are.
Brian Williams 23:28
Yep, absolutely. That comes back to the relationship that you have with your providers. And I'll be honest, I do find open source, just as a philosophy, generally a lot more welcoming towards that sort of feedback and change. You don't make open source software unless you genuinely believe in it. And that philosophy generally comes with a lot more back and forth and a good relationship with your end users.
Kristina Hoeppner 23:52
So Brian, we're getting towards the end of our session today already, and therefore three quick questions for you. What are three words that you associate with portfolio work?
Brian Williams 24:05
Portfolio work, three words, I'd probably say 'learning', 'reflecting', and 'revisiting'. The learning side, absolutely. Reflection - the amount of times I've written something down on a piece of paper but not actually thought about what it meant or the context, or actually reflected on what I've just, you know, learned. And even worse, if you never look at that piece of paper again, you never get the revisiting. I wish we'd had portfolios, or the ePortfolios, back at school because I feel like they're a wonderful way to document your learning, but also to go back to it later.
Kristina Hoeppner 24:38
Now what tip do you have for learning designers or educators in general who create portfolio activities?
Brian Williams 24:46
Just from a normal day-to-day point of view, I would say, and this is easier said than done, pick a good platform. I would say try it yourself as well. If you're going to get your students to use it, try using it yourself. What are the limitations of it? Is it fun to use? And then try and think to yourself, 'Okay, what other options are potentially available?' And from a security point of view, I would always say what we've already discussed today. Ask those questions of your providers. Where is our data being stored? Is it being passed on to third parties who might want to mine it for ads or anything like that? 'Hey, have you got a privacy policy?' Something as simple as that before you go any further. Just the fact that you're asking that question will pass on really important feedback to the sellers of the software: that we care about our privacy.
Kristina Hoeppner 25:36
Mhh. And what tip do you have for people creating their own portfolios?
Brian Williams 25:41
Personally, I would say, create the kind of portfolio that you'd want to read. That's really important because reflection matters, but say in six months' time you want to go back and revisit the information, or reflect on it again, or just go, 'Hey, what was that I learned?' If you've created a portfolio which is not particularly great, then you're not going to want to read it, and if you try to show it to other people, they're not going to want to read it either. So create a portfolio that you would want to read, and however you go about that is entirely up to you. But you know, get creative with these sorts of things.
Kristina Hoeppner 26:16
Thank you so much for this awesome advice. Because yes, the portfolio is the one that we create for ourselves. Yes, there is an audience, but it is our product, and therefore it should reflect who we are.
Kristina Hoeppner 26:29
Thank you very much, Brian, for this conversation this morning, looking a bit into information security, data privacy, and so on, and why they are important. I really appreciate your time.
Brian Williams 26:43
Awesome. Cheers, Kristina. Thanks for having me.
Kristina Hoeppner 26:45
Now over to our listeners. What do you want to try in your own portfolio practice? This was 'Create. Share. Engage.' with Brian Williams. Head to our website podcast.mahara.org where you can find resources and the transcript for this episode.
Kristina Hoeppner 27:03
This podcast is produced by Catalyst IT, and I'm your host Kristina Hoeppner, Project Lead and Product Manager of the portfolio platform Mahara. Our next episode will air in two weeks. I hope you'll listen again and also tell a colleague about it so they can subscribe. Until then, create, share, and engage.