User Privacy as a Differentiator with Adam Stone of Secure Digital Solutions

Welcome to Constant Variables, a podcast where we take a non-technical look at mobile app development. I'm Tim Bornholdt. Let's get nerdy.

Today we are chatting with Adam Stone of Secure Digital Solutions. Adam is a data privacy and security executive with more than 20 years of experience implementing and developing data privacy and security innovations. In this episode we talk about the California Consumer Privacy Act (the CCPA) and why you should care about user privacy, the principles behind the Privacy by Design framework and how you can incorporate them into your app, and managing third-party dependencies within your app, and we touch on the battle between the FBI and Apple surrounding end-to-end encryption. So without further ado, here is my interview with Adam Stone.
Adam Stone, welcome to the show.

Hi there, thanks for having me.

I'm really excited to have you. Like we were talking about before the show, I think we're both privacy nerds, in the best sense of the word, so I think this is going to be a fun discussion for folks who maybe aren't as comfortable with the whole privacy and security angle of app development.

I appreciate that, and yes, indeed, I'm a nerd, and I'm trying to de-nerdify the things that I have to say about the topic. That's what this is all about.

I like that: de-nerdifying. Tell us about yourself, and tell us about Secure Digital Solutions and what you guys offer.

Well, thanks a lot. I am a Twin Cities-based consultant working for a firm in the Minneapolis area, in St. Louis Park, called Secure Digital Solutions. The firm has been around since 2005. It started out as something more of a traditional IT security consultancy, focusing a lot more on the technical aspects of security, but also the management aspects of security. It has evolved over several years, and we can now say that we are a management consultancy that focuses on data security and privacy management: operational performance, organizational improvement, governance, compliance, and strategy. I've been with this firm for five and a half years, and I am responsible for our professional services. We have a second pillar of the business, which is in fact a software app called TrustMAPP, and its purpose is to enable information security leaders to measure and track their performance relative to information security maturity.

Nice.

For me, I've been in the privacy and security
business for a little over 20 years. I fell into it by happenstance; I do not have a technology background, and I did not have it when I entered the business. Instead, my first degree is in philosophy, and I used that degree to jump right into the world of accounting. I spent a few years doing that and transitioned from that to becoming a professional trainer focused on helping accounting managers, CFOs, and the like use software I designed to help their businesses. Then, around the dot-com bubble, I fell into this world of data privacy and security, and I haven't really looked back. I've had lots of opportunities to serve across myriad industries, starting in financial services and insurance and branching out to academia, marketing, manufacturing, pharmaceutical production and distribution, and healthcare. I have enjoyed my time working in and on behalf of all of these different industries, and I have been happy to be part of the growth of the privacy industry and the privacy profession over the past 10 to 15 years.

Well, I think going into it from a
philosophy angle and moving into privacy, there are so many things you could touch on in theory, and there's a whole thing we're going to talk about with Privacy by Design. I can see why it's good that you have that full philosophical background, because I think it lends itself well to this industry.

Well, I would agree. Privacy, and even information security, is not a technology subject in my estimation. It is a human subject, specifically human malleability, human greed, and other negative character traits that lead to our need for security: the need to secure our information assets, but also to preserve the secrets that we share with folks, whether we are sharing those secrets with our best friend, with some large corporation, or with the government.

Well, that lends nicely
into our first topic of discussion, which would be around the government. I want to talk a little bit about the CCPA. I know a lot of our audience is app owners and product owners, and they've probably heard of this acronym and know it has something to do with California. And we're here in Minnesota, so we don't need to worry about that, right?

Well, I wouldn't say that. California tends to be the leader; they stand in the vanguard with issues like this, and the CCPA is no different. The acronym, by the way, stands for the California Consumer Privacy Act of 2018, but we'll just call it the CCPA because that's easier.

Talk about it a little bit in terms of, as an app owner, why does it exist, and why do I need to worry about it?

The CCPA has behind it a long history of discussion and debate about the role of
these so-called computerized systems, or databases, which really started concerning people as early as the '50s and '60s, and it really is just a natural evolution of those concerns on behalf of the public, both in terms of government ownership and control of data and, importantly, now the control of data in the hands of private corporations.

The law itself emerged out of the work of an activist in California. This individual, who turns out to be a very successful businessperson, had a lot of resources behind him and had agitated for a new data privacy standard. He effectively put forward a bid to get a referendum on the ballot in 2018 that, had it passed, would have created substantial privacy requirements for organizations. In response to this threat of a referendum, the legislature within the state of California decided to create a bill to address these issues, and because of the timing, the bill was rather hastily put together. In fact, it emerged almost out of nowhere and caught folks who were not watching California politics closely really off guard.

In the early days the media often characterized the law as something similar to the EU's General Data Protection Regulation, or GDPR. There are really quite a few differences between the CCPA and the GDPR. The basic mantra defined within the GDPR, around providing and preserving rights to individuals with respect to privacy, is maintained, but the CCPA takes a uniquely American approach to privacy, and that is in fact where it diverges from the GDPR.

Well, it's interesting that there are these regulations,
one coming from the GDPR. I heard you on an earlier podcast explain that privacy from the European standpoint is shaped by the history of fascism in Europe, and so there's a certain amount of privacy that you would want to have after going through a regime like that. And then there's the American entrepreneurial view of privacy: someone's giving me this data, and we can argue whether willingly or unwillingly, and since they gave it to me, it's mine and I can do whatever I want with it. I'm curious to hear your thoughts around why we need to worry about privacy. As an app owner, I've got 600 other things I need to worry about, with my budget, with timelines, with user experience, and with everything else that goes into building an app. Why should I care about user privacy?

Well, there are all sorts of good
reasons, and there are different angles that folks take. There is the angle of the fear of government overreach: if you enable governments to maintain all sorts of data about you and to hold your deepest secrets, that makes it all the easier for government to control you and the population at large. It can also lead to societal catastrophes such as we saw in Europe starting in the early '30s, and that really did create a sea change in the way folks understood the importance of keeping their secrets to themselves.

The question around secrets, however, is an interesting one, because there are, roughly speaking, two schools of thought around secret keeping. One is that if you've got a secret to keep, then clearly it must be something negative about you that you don't want to get out to the public; perhaps you've done something illegal that you don't want the authorities to know about. Those folks tend to be a little more antagonistic when it comes to privacy and a little more cynical about the reasons why folks assert the need for privacy. The other camp sees privacy as a key component of human psychological health as well as societal health, and naturally the threat of harm in the event of a breach of one's privacy is always present, whether it's physical or financial harm, humiliation, or discrimination. There have been myriad reasons throughout history why folks feel inclined to hold certain secrets about themselves while selectively disclosing other information about themselves to parties that they either trust implicitly or have to trust by virtue of the circumstances around them.

My personal opinion is that we're in that latter state: we are in a position where we effectively have to trust the organizations that we divulge our secrets to, because, frankly, to survive in modern society it's virtually impossible to get "off the grid," quote-unquote. With that stated, we have this innate need to trust the individuals to whom we are disclosing our secrets, and sometimes we just have to trust that the companies we do business with will do the right thing with our data. Only after those companies breach our trust, by some bad act or some piece of negligence leading to a data breach, for instance, might we lose trust in those organizations and choose not to divulge any more secrets to them.
Well, that would render a lot of what we do as app developers useless. A lot of times we're trying to provide value with that data, and there are a lot of good reasons to collect that information. But if you don't have good intentions from the get-go, in that you're not thinking about how you protect this data once it's been given to you and treat it with respect and care, everyone's just one hack, or one act of negligence from an employee, away from being on the front page, showing how bad it is to trust you with your data.

That is the reality, and for better or worse, most of us in Western society have been brought up with the notion of the Golden Rule and the variations of that ideal. Part of the Golden Rule, as we know, is to do unto others as you would have done to yourself, and that doesn't seem to always carry through in commercial relationships. It's unfortunate, but those were the early days. My sense is that a lot of organizations are recognizing the business value of identifying trust as a key part of the relationship with their client base, and obviously the profit motive is a big part of that. I'd like to think that we are thinking with a bit more of an ethical lens as we mature around this notion of app development, and even just around what the internet is and what purpose it serves from a social promotion perspective.

Absolutely. With that in mind,
let's say we are working for an organization and we do want to build that trust; we see the value in privacy and in protecting that privacy. I know that you've talked in the past about a framework that people can follow called Privacy by Design. Maybe we can talk about that a little bit and explain what it is and how it can actually serve as a way to enforce the belief that privacy is important from the ground floor when we're building mobile software.

I really appreciate you bringing this up. I
am a huge advocate for Privacy by Design. Interestingly, this is not a new area of thinking around privacy, but it is starting, albeit slowly, to pick up some steam within the app development community, and I like to think of myself as one of the folks who really promotes the implementation of Privacy by Design in an effort to improve trust between app developers and their clientele.

Privacy by Design, or PbD as folks in the privacy industry call it, is a set of principles that was developed by a wonderful lady named Ann Cavoukian, who at the time was serving as the Information and Privacy Commissioner for the province of Ontario in Canada. She developed this document around 2009, and at the time it emerged rather quietly, but its principles are so salient that it has endured; it has really aged like a fine wine, in my view. The principles that she brings forward have not needed any form of enhancement or change; they seem to be a pretty solid set of principles. There are seven of them, and I'm sure that we'll talk about them as we go along here.

I was going to say,
what are some of those principles?

Principle one is that app developers should consider privacy proactively, not reactively, and should take an approach that is preventive, in that it prevents exposure of one's private information, versus having to remediate the circumstances after the fact. So principle one is: proactive, not reactive; preventive, not remedial.

Number two is privacy as the default setting, and when we get a moment I'd love to talk about this a little bit more; it's one of my favorite principles.

Foundational principle three is privacy embedded into the design, and that's of course very relevant here for software and web app developers: privacy is embedded into the design from ideation to the sunset of a particular application. We'll talk about this more in a bit.

Number four is full functionality, meaning that the developer should develop toward a positive-sum approach, not a zero-sum approach. Developers should not create a system that says: either you use my system and give away some of your privacy, or you don't use my system; you choose. That's a zero-sum game. A positive-sum game provides some balance between the giver and the receiver, as it were: the client and the app developer. So principle four is full functionality: positive-sum, not zero-sum.

Principle five is end-to-end security, full lifecycle protection, and I think that makes sense inherently; we can talk about that in a little bit. This is where security comes into play. We know that security is a key enabler of privacy; in fact, without security, privacy can't really exist in the digital world, and so we do need our friends in information security to help us maintain privacy.

Principle six is visibility and transparency: keep it open. In short, what that means is: don't create a black box. Don't create a situation where an individual inputs their data into some system that is completely opaque to the end user, which spits out some data, and the individual has no idea exactly how that data came to be, because the mechanisms, the algorithms inside that software application, just are not divulging those secrets. So we're promoting visibility and transparency in principle six.

The last principle within PbD, principle seven, is respect for user privacy: keep it user-centric. This really is part and parcel of the movement toward user-centricity in software development, to develop applications that are maximally usable for the individuals they're intended for. If you add privacy to that usability equation, then what you're saying is that you want to make sure that individuals don't have to search hard to determine which privacy options they choose or deny, and that they are presented with an application that outwardly expresses the value of the relationship between the app developer and the user of that app. So those are the seven principles of Privacy by Design.

Well, there's a lot to unpack
in all of that. I think the one that I wanted to touch on was the one that you actually brought up and said you wanted to talk about: number two, privacy as the default setting. Whenever we're talking with clients about settings pages, it's always funny when you get to a point in designing software and people say, well, we'll just make it an option and people can choose one way or the other. Only about 10% of people ever even open the settings screen of an app and look to see what the settings are. As it relates specifically to privacy, you can extrapolate from that: it's going to be people like you and me, and most software developers, who understand privacy who are going to go in and make tweaks as necessary. But if you aren't thinking about privacy at the forefront and making the software as private as possible from the get-go, then most people are never going to go in and change that.

That's
right, and this is my favorite personally, but it is something that I will acknowledge is very difficult for software developers to reconcile. That is because a lot of apps, as we know, are released to the public for "free," quote-unquote, and those developers need to recoup the costs of their development and also make a living off of these apps. That is really antithetical to the notion of developing a system that is private by design, that doesn't collect data by design, because we need to make money; we need to make a profit as we release these apps to the public. But there are a significant number of developers out there who are really seeing the value of trust and authenticity as the key to standing out among their peers and their competitors, in effect giving them a competitive edge by virtue of design choices that outwardly express this respect for an individual's privacy right from the get-go. If you'd like, I could give a great, real clear example of how this
plays out in the real world.

Go ahead.

Almost everybody has a mobile phone nowadays. Like many people, up until about five years ago I had really resisted buying into Apple iPhone products, for whatever reason; I just didn't want to do it. The mobile phones that I was using back then were the competing version, which in this case was Android, and I stuck with that from that point on. At some point, three or four years ago, I was convinced to move to an iPhone, and so I did. As I started opening it up and looking at some of the settings, I was surprised by something. As a default setting, for many of the applications and utilities and other things that make an Apple iPhone an iPhone, many of the most data-intrusive features, the leakiest aspects of interacting with the phone, those points of potential leakage, were turned off by default. In other words, I needed to opt in to have Apple basically execute some of the features within the phone that were leakier than others. That sent a huge message to me. It told me that Apple has actually thought about this privacy issue and has embodied that whole notion of outwardly expressing respect for an individual's privacy through these really small signals, but they were strong enough for me to pick up on. And really, when you take a look nowadays at the default controls that are in place when you buy a new Apple iPhone and you compare that with its competitor, you'll notice a complete difference in the philosophy behind how these phones are set up. Apple iPhones have much stronger privacy controls by default than Google Android phones do. And Apple, recognizing the value in that differentiator, actually created a commercial that touted how strong their phone was with respect to data privacy controls. The implicit message that Apple sent is: look, our competitor, that nasty Google company, takes all of your information; their phones are leaky as all get-out; so you should buy our phone, because we help protect your data and we don't leak as much data as our competitors do. They were literally using that as a market differentiator. I thought that was brilliant.
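The "off by default, opt in explicitly" behavior Adam describes is the essence of PbD principle two, and it is simple enough to sketch in code. Here is a minimal illustration (Python for brevity; the class and feature names are invented for this example, not taken from any real SDK):

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Settings following PbD principle 2: every data-sharing
    feature defaults to OFF on a fresh install."""
    share_location: bool = False
    share_analytics: bool = False
    share_voice_recordings: bool = False

    def opt_in(self, feature: str) -> None:
        # Enabling sharing requires an explicit, named user action.
        if feature not in self.__dataclass_fields__:
            raise ValueError(f"unknown feature: {feature}")
        setattr(self, feature, True)

settings = PrivacySettings()        # fresh install: nothing is shared
settings.opt_in("share_analytics")  # user explicitly enables one feature
```

The point is structural: a user who never opens the settings screen, which by Tim's estimate is roughly 90% of users, leaks nothing by default.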
I couldn't agree more, and that's one of the big reasons I've stuck with Apple. I was one of the rubes who waited in line on day one for the iPhone, but every time I even get a notion of maybe looking over and seeing what's on the other side of the fence, the first thing I think about is the privacy implications. And no one's perfect. For example, last year there was a big brouhaha around Siri and how there were third-party contractors who were able to listen to actual recordings that people made when they used Siri. That was a big mess, because of course when you do use privacy as a differentiator, then it's fair game: when you do have a leak, or when something does happen, everybody latches on to that and says, ha, now who's the secure company? But Apple, to their credit, then came out with iOS 13 with all these privacy protections around Siri. The recordings don't go to third-party contractors anymore, and they added an opt-in when you first activate Siri that says: hey, we want to use your voice recordings to improve the product; by default we're turning that off, and if you want to enable it, we will enable it. On top of that, they will come back to you after a few months and remind you: hey, we're still saving your recordings and monitoring them; is that something you still want to do? And to me, there's a famous interview with Steve Jobs around privacy where he says, with regard to privacy, that you need to ask early and often and just keep beating down the door to enforce that, because it's so easy to hide behind a well-crafted privacy policy or your terms of service and to mask what your true intentions are with user data. When you're being forthright and upfront about it, I think the people who do care are definitely going to recognize that, like you said, and switch allegiances as a result of it.

Yeah, absolutely. What you're saying is not only an application of privacy as the default setting, but
you’re saying is an application not only of privacy is the default setting but
you’re also saying this the use of a positive some development
attitude versus a zero-sum because if we use the Apple phone the Apple iPhone
with these settings we keep these settings on in other words we do our
best to keep the phone from leaking that does not materially impact the
functionality of the phone itself yes it creates a few
impediments to convenience and being able to instantly share certain bits of
information but if you are willing to balance that with your desire for
privacy I think that the the notion that Apple iPhone is operating on a positive
some versus a zero-sum design approach is absolutely brilliant and I should
mention the you know you’re right for folks that put themselves out there as
being more attentive to data privacy yeah they of course expose themselves to
criticism I would argue that those organizations that are authentic and
that express themselves in terms of authenticity are the ones who are going
to win the day at they whether a breach happens or not because we’re human and
bad things happen mistakes happen and no system is entirely foolproof
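The Siri behavior Tim described a moment ago, off by default, an explicit opt-in at first activation, and a reminder a few months later, combines "privacy as the default" with the "ask early and often" advice, and the mechanism is simple enough to sketch. This is a hypothetical illustration (the names and the 90-day window are my own assumptions, not Apple's actual implementation or policy):

```python
from datetime import datetime, timedelta

REPROMPT_AFTER = timedelta(days=90)  # arbitrary re-confirmation window

class ConsentRecord:
    """Tracks a single opt-in consent and when to re-confirm it."""

    def __init__(self):
        self.granted_at = None  # None means the user never opted in

    def grant(self, now: datetime) -> None:
        # Consent only exists after an explicit user action.
        self.granted_at = now

    def needs_prompt(self, now: datetime) -> bool:
        # Prompt on first use, and again once the consent has aged out.
        if self.granted_at is None:
            return True
        return now - self.granted_at > REPROMPT_AFTER

consent = ConsentRecord()
t0 = datetime(2020, 1, 1)
assert consent.needs_prompt(t0)   # fresh install: must ask before recording
consent.grant(t0)                 # user opts in; valid until the window ages out
```

The design choice worth noting is that consent is stored with a timestamp rather than as a bare boolean, which is what makes the periodic "are you still okay with this?" reminder possible.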
It's the organizations that come out right out of the gate and say, look, we screwed up, we apologize, and here's our plan to make amends: improve our system, shut this down, change this, whatever it takes. We don't see that often enough. Unfortunately, what we see is that when mistakes happen, whether a company puts itself forward as being a privacy-conscious organization or not, companies tend to circle the wagons and operate under a sort of code of secrecy, as it were, and that creates a suspicious public. When folks don't admit their faults right away, that naturally causes us as humans, or in this case as customers of a particular company, to ask: so what are they trying to hide, and why? It allows us to imprint upon that organization a real sense of distrust in the way they do their business. I've also read, though I don't have the evidence in front of me to share with you, that the millennial generation really values authenticity, and if an app developer is targeting that audience, then they ought to think really clear-headedly about this notion of positive-sum versus zero-sum design approaches.

Yeah. As a millennial, I don't speak for every millennial, but I can certainly say that it's another reason why it's a notch in Apple's belt for being authentic when they do own up to those issues.

So, one thing
around Privacy by Design: if I am a product owner and I'm looking to have somebody come in and build an app for me, and I really do care about privacy and about Privacy by Design, are there tips, or any sort of ways I can tell, that whatever team I'm hiring to come in and build out an app does care about Privacy by Design?

I think that's a great question. The seven principles are just that: principles. They're very high-level statements, and they're meant to be interpreted relatively broadly so that they can actually be implemented in real life. I would suggest that as app owners are looking for development help, they should take these seven principles and frame them as questions to their potential development partners, and get the responses back from those development partners. So a question might be: what is your design approach to the settings within a system, relative to data privacy and security? Just leave it open-ended like that and see what the developer comes back with. I think that flipping these principles into open-ended questions would signal a lot for the owners of apps, and eventual apps.

Yeah, I agree. If somebody came to me
and asked me that specific question, I almost feel like I would go off on a big rant, because I think you can tell the people who do care about it; you can just tell the passion. It goes back to the authenticity point: if somebody has actually thought about this stuff and cares about it, you're going to elicit a response out of the developer, whereas if you ask the question and they just say, uh, I don't know, what is that?, that's a clear indication as well.

Yeah, absolutely. That is a flag, and that ought to factor into the decision on which sort of partner you choose.

Switching topics here a little bit: when we're talking about incorporating third-party dependencies, there are so many routes to go down. I guess one question right out of the gate is: when we're talking about incorporating a third-party dependency into your app, what does that even mean? What are we talking about with that?

Well,
in today's world we are highly dependent on other organizations that have established systems, products, platforms, and controls to build whatever it is we want to build. If I am a small business owner and I have an idea for a new app, my inclination, because I'm a small business owner and I lack the resources to spend all the money necessary on research and development and all that comes along with that, is to look for established organizations that have already developed the platforms and the protocols and the programs that I can use as, essentially, a platform on which to build my application, a sort of building-block approach, which is substantially less resource-intensive than trying to create something on your own. So with that in mind, we are oftentimes highly dependent on at least a couple, if not many more, outside parties helping to enable whatever it is we're trying to make happen within our software application or our web application. All of these parties that you rely on are, of course, third parties. In most cases they're all their own separate corporations, with their own separate profit motives and philosophies and value systems and so on and so forth. When we have to rely on those organizations, we are relying not only on the fact that their product will make whatever it is we want to make work, but also that they will do the right thing when they gain access to, let's say, personal data that we collect through our application. That's where the real balance needs to come in. Ultimately, just as I need to be able to trust that an app I am interacting with as an individual will do the right thing, business owners need to trust that their third-party service providers will in fact do the right thing when it comes to treating data in an ethical fashion.

I guess when it comes
to it, that's what it comes down to at the end of the day: trust, right? And I guess there's no real way around it; at the end of the day we're going to need to include some third-party dependencies as app owners. I can't even build my software without relying on the third-party dependency of Apple, or Google, or whoever, offering a platform for me to develop an app on. But there are certain third-party dependencies that we might take on because they're kind of in vogue, or because they're something we think we need to have. I'm thinking, for example, that one of the easy, low-hanging-fruit cases would be around login and authentication: everybody thinks, well, we need to have login with Facebook, and we need to have login with Google, and we need to have the seven different abilities for people to log in through their social media platforms. But I think people sometimes add those dependencies without really thinking through what that actually means. Are there steps or exercises that you might lead your clients through to have them explore what third-party dependencies they're using in their apps and think through the implications of including those?

That's a great question. Before
even thinking about the issue of how we manage third-party risk, I would recommend that app owners first discover for themselves, and actually write it down on a piece of paper to make it real, where their parameters are. What are your values as a business owner? What are your objectives for your business? I would contend that if one of those objectives is to maintain the trust of your clients, then that really informs the approach you take with respect to finding vendor partners, whether software developers or Apple or any of the myriad applications that we use for various bits of functionality in our apps and web apps.

With that stated, there are a number of platforms and frameworks in place to help us manage third-party risk. Unfortunately, for small and mid-sized businesses, these platforms and frameworks oftentimes come with a cost, and it's not only the cost of actually buying into a third-party risk management framework but also the opportunity cost of stopping for a minute and taking the time to assess the risk of your relationships. I recognize that as an app owner, or a future app owner, one of your primary motives is to get that app out there as quickly as possible, and anything that stands in the way of that or creates friction is difficult. I understand that. So again, I go back to: what is your value proposition, what are your internal values, and how do you want to express those? Then let that inform your business decisions on how much time and money you want to invest in properly managing third-party risk.

That's fantastic advice. I think a lot of
times, when you are a start-up and you have an idea and you go forward, I know when I started my business, that wasn't the first thing I thought of. The first thing I thought of was how can we make money, and you kind of rely on your own internal core set of values, as it were, when you do that. But then as your organization grows, you actually do need to think about what your core values are and why you would include these dependencies. Like you said, if it's trust and taking care of users' information with respect, that kind of lowers the number of dependencies you might want to rely on, because if you throw an ad network into your app, you're not knowingly leaking that information, but every single ad platform you include in there is just another potential vulnerability for you. So I think that does make perfect sense to me. Well, it really does. This
is not an easy decision for app owners to make, I recognize, because there's really this tug-of-war between wanting to do good on behalf of your clients, and future clients, and maybe society as a whole, but then having to put food on the table at the end of the day. How do we do that in this new world of app development, especially if we're offering these apps for free or at a substantially subsidized price? We need to make our money somewhere, and so we do that by, effectively, reselling the data that we collect through the use of our app to other parties. So it's a real balancing act, and it's not for everybody. I recognize that this sort of approach probably resonates more with organizations that acknowledge that whatever app they're building is something that truly rests on a foundation of trust, and that if that trust is chipped away in any fashion, it can be an existential crisis for that organization. It's these sorts of organizations that I think the message will most resonate with. Man,
there's so much we could talk about with privacy, and we can really get down the rabbit hole with it. There's one last topic of conversation I wanted to cover in this realm, and it's more of a fun topic for me because it's so fascinating. We're recording this in February of 2020, and there's been a lot of news back and forth between the FBI and Apple, specifically with regards to end-to-end encryption and keeping users' information private. Just to give a high-level overview, and you can correct me if anything I get is wrong here: from what we're seeing in the news reports, the FBI is saying that they have a phone that has potentially relevant information to a terrorist investigation, and they want access to it, and Apple is just not giving it to them. Apple's story is, "we cooperate with you all the time"; they did something like seven or eight thousand subpoena requests that they responded to and helped where they could. But Apple really is hammering home that they want to be secure and private, and part of that is encryption, and what the FBI is asking for is a quote-unquote backdoor into people's phones. What Apple is telling them is that that's just not possible with math and with encryption; it's just not a thing that can happen. And so there's this kind of epic standoff going on in the news. As somebody that's entrenched in this world of security and privacy like you are, what are your thoughts around this whole issue? I could probably guess where you're at, but I just think it's such an interesting topic, and I think our listeners might be interested to hear what you think about that. Yeah,
I appreciate that. As you intimated, I do kind of land on the privacy-advocate side of the argument, though for my own reasons. That's primarily because I am concerned about making it more convenient for law enforcement to gain access to what were considered to be private communications at the time; whether those private communications were against the law or not is, in my view, a different argument, outside of the convenience issue. You know, it's more convenient if the government has a backdoor into Apple's phone system, versus having to go through whatever mechanisms they need to go through to try to quote-unquote hack the system to get access. I am quite concerned about overreach, especially in this age where terrorism seems to be the foundation of every argument that law enforcement uses to gain more and more convenient access to communications that we believed were private at the time. That's concerning to me, primarily because of all of the things we've learned over the past several years: the Snowden revelations, and the other leaks that have come out since then. These all sort of compound, one over the other, to create a sort of worry that, wow, it really feels as if our government is trying, with little steps that often go unnoticed by the public, to chip away at the protections that app developers have put into their systems to safeguard that data. I will say that I doubt Apple made these decisions out of purely altruistic goals. They have a profit motive to consider, and that is: if the buying public fears that Apple will, without much friction, just give away information upon request by the government, well, then folks are going to stop buying Apple phones eventually. And so there are these two things happening. The privacy advocates are hailing this as Apple standing firm against government overreach, or perceived government overreach, and you have other folks on the business side saying, yep, this is a good business decision; we don't want to lose our client base because they perceive us to be essentially bowing down to government demands for data. So for me, I have an appreciation for the arguments that are going back and forth, and as a citizen I am concerned about making it convenient for the government to gain access whenever they believe they have a need to investigate this or that activity, whether terrorism or otherwise. It seems to me that if we want a fair and balanced sense of justice in our society, there ought to be some friction; there ought to be some hoops that government goes through to gain access to data. And in fact, at least in law, we have these hoops: they're called subpoenas and warrants and things of that nature. Unfortunately, what we're seeing is a willingness on the part of several organizations to skip them. I think the latest news we've heard comes from those web apps that do the genetic testing, and I won't name the companies, but these folks that have the large genealogical databases also have large databases of genetic material. These organizations seem all too willing to share with the government, not on a case-by-case basis, but rather in a sweeping fashion: let's give you a whole batch of data, and you can decide within that data what is actionable in terms of an investigation or prosecution. That really hits at the crux of what I think Edward Snowden was trying to expose: this notion of blanket surveillance by the government really degrades, or threatens to degrade, our
understanding of what a democracy is. Well said. I agree with almost everything you said. The systems you were talking about, subpoenas and having to go get a warrant, were put in place for a very specific reason, and we're at this point in society where information moves so fast that law enforcement wants to move just as fast as everybody else can move, because we all have that capability. I can pull out my phone and send a message twelve different ways if I wanted to talk to my wife right now, for example; but if law enforcement wanted to crack all of those or get access to them, they'd have to go down to the courthouse and file a subpoena, and it's just not as fast as maybe they would like. But like you said, as a private citizen I very much appreciate having my privacy, especially knowing how easy it is to lose it and how much easier technology is making it to lose. So, I really
appreciate you taking the time to come in today and speak with us, Adam. Where can people learn more about SDS, and how can people get in touch with you if they have more questions about privacy? Well, thanks, Tim, I appreciate it, and I am really glad that you had me in today to take a little bit of time to talk about an issue that I am indeed very passionate about. Folks can find my organization at trustsds.com; that is secure digital solutions' website. Folks can also look me up on LinkedIn. Despite being a privacy person, I use LinkedIn quite aggressively. I'm not a perfect person myself, but I am able to make my own choices in terms of which secrets I choose to give away and which secrets I choose to withhold, and with respect to LinkedIn, I am making choices to disclose. So folks can find my disclosed secrets on LinkedIn under Adam Stone; how I bill myself is "data privacy and security executive," and I would love to chat with anybody who is interested in the subject and also interested in having some guidance around how to build a software application with privacy by design as sort of the framework, or the mantra, that we weave into every stage of the software development
lifecycle. I love it, because, to your point about using LinkedIn as a privacy expert, you're still using it. It's a spectrum, right? It's a never-ending pendulum we're swinging back and forth on, between having everybody have access to all of our information and having nobody have access to any of our information. And as long as people understand what they're trading by giving up their personal information, we're in a democracy and we can choose to do that. I think it's important as app owners also to realize that it's a spectrum, and we just need to be constantly vigilant to make sure that we're making all the right choices. Absolutely. It really comes down to control, and that's what privacy is: the ability to control one's innermost secrets, which frankly are the diamonds of, you know, our psyche that we are trying to protect. Sometimes we like to give diamonds away; other times we like to keep them. And the ability to control that sort of selective disclosure is what it's all about. Couldn't agree more. Thank you so much
for joining us today, Adam. Thank you, Tim, great to talk with you. A big thanks to Adam Stone for joining me today on the podcast. As he said at the end there, the best place to get in touch with him is LinkedIn, so we'll put a link to his profile in the show notes. Those show notes can be found at constantvariables.co, and you can get in touch with us by emailing hello@constantvariables.co. I'm @TimBornholdt on Twitter, and the show is @CV_podcast. Today's episode was edited by Jordan Daoust. One quick favor to ask of you: if you've got two minutes, please head on over to the Apple Podcasts app and leave us a review. I'm sure you hear that all the time if you listen to a lot of podcasts, but it really does help our show rank higher in those charts. Just head to constantvariables.co/review and we'll take you right into the app, where you can leave us that stunning review. This episode was brought to you by the JED Mahonis Group. If you're looking for a technical team who can help make sense of mobile software development, give us a shout at jmg.mn.
