S2E07: What Actually is “Big Data”? – feat. Data Evangelist Fru N.

On today’s episode we welcome Fru N. to help answer the question: what is big data?

We hear the term “big data” all the time. And as technology infiltrates more and more of our lives, tech companies learn more and more about us. Our preferences, our habits, our health, our relationships… And they do all that using “big data.” 

But what actually is “big data”?

We answer that question on today’s episode, as we welcome Fru N., a Data Evangelist and IT expert with a master’s degree in Enterprise Architecture from Penn State University, and the head of the Enterprise Data Ninjas group. He’s worked with corporations both large and small, including several Fortune 500 companies, across the core industries: retail, banking, health services, and food & hospitality.

We chat about how technology functions, how it is designed to function, and the various issues that go into its creation. We talk about data: what it is, and why companies and corporations are shifting toward collecting consumer and user data. This interview, which is readily understood by all listeners (no tech background needed!), will give you insight into the underpinnings of the new age of data that we are in, and where we go from there.

In this episode you will hear: 

  • What is data
  • Just how much data is being collected
  • The seismic shifts in data collection
  • What is data marketing
  • Ways of data collection
  • Why this particular data evangelist doesn’t use his Amazon Alexa

Transcript

R Blank 0:01
On today’s show, I’ll be interviewing Fru N. This is going to be a change-up from our normal kind of interview. Normally the people I talk to are experts in health and wellness or public policy, or both. But as we’re talking this season about issues of our own relationships with technology and data, I wanted to bring on a specialist from a different perspective. The Healthier Tech Podcast is about creating healthier relationships with technology, so I wanted to chat with an expert who can help illuminate this from the perspective of the tech itself: how it functions, how it’s designed to function, and all the various issues that go into its creation. Fru is a data evangelist and an IT expert with a master’s degree in enterprise architecture from Penn State, and he runs the group Enterprise Data Ninjas. He’s worked in enterprise architecture, data science, and machine learning with several Fortune 500 companies in multiple data-heavy industries, including retail, banking, health services, and food and hospitality. Today, we’re going to chat about the new age of data that we’re in.

Fru N. 1:00
You know, having an understanding of what data is, I think, really lays the foundation for why we’re seeing this massive, seismic shift in the industry of companies looking to collect data.

R Blank 1:11
Before we begin, a brief word. This podcast is brought to you by my company, Shield Your Body, where it is our mission to help make technology safer for you and your loved ones to enjoy. Inspired by the life’s work of my father, Dr. Martin Blank, one of the world’s leading EMF scientists, I founded Shield Your Body in 2012. And we provide a ton of great, free resources for you to learn all about EMF radiation, like articles, ebooks, webinars, and videos. We also have a world-class catalog of laboratory-tested EMF and 5G protection products, from our phone pouch and laptop pad all the way up to our bed canopy. All of our products are laboratory tested and include a lifetime warranty. Learn more about our products, and why we have hundreds of thousands of satisfied customers around the world, at shieldyourbody.com. That’s shield your body, all one word, dot com. Or click the link in the show notes and use promo code POD to save 15% on your first order, with free shipping throughout North America, the UK, and Europe. Hi friends, welcome to the Healthier Tech Podcast.

Fru N. 2:08
Hey, thanks for having me. I really appreciate being here.

R Blank 2:11
No, I really appreciate you making the time. So my listeners have just heard your intro. But could you tell us a little bit about your background?

Fru N. 2:20
Oh yes. So I am Fru. I’m a data evangelist; I think that’s the term I really like to go with. And what that means is I help companies understand the value of data, understand the value of AI, machine learning, and DataOps, and how they can take advantage of that to meet their business objectives, to serve their customers better, to innovate better, and to come up with better products. So that’s a lot of what I spend my time doing: just helping companies be data-driven. From a credential perspective, I hold a master’s degree in enterprise architecture. And it’s exactly what it sounds like: helping enterprises align their business objectives with the architecture that supports those objectives. It could be data architecture, security architecture, technology and infrastructure architecture, and bringing all of that together. So a lot of my time is spent in the data space, and that’s what I do on a daily basis.

R Blank 3:23
That’s great. I’m really glad that we were able to connect, because the goal today is I want to help my listeners get some insight into, you know, how the sausage is made. We hear all the time about issues regarding abuse of personal information, data privacy, data control. There are terms like GDPR, CCPA, CCRA; there’s a Facebook whistleblower in the news all the time now. And most of my listeners, like many consumers, I think, read about this stuff, and they’re kind of baffled about how these systems emerge. So that’s why I’m really looking forward to speaking with you today, so you can help educate us about how these decisions get made behind the scenes. As a starting point, I think it would be helpful if you could help us understand, or visualize, just how much data is being collected. I mean, we hear the term big data all the time. How big is big data?

Fru N. 4:24
Yeah, that’s a really good question. I’m actually going to take a step back to answer that question and just talk about data, right? Because usually we talk about big data, but having an understanding of what data is, I think, really lays the foundation for why we’re seeing this massive, seismic shift in the industry of companies looking to collect data. At the fundamental level, data is just facts and figures that represent the world. So just imagine, a long time ago, a caveman went out and hunted, you know, two elephants or two antelopes, and came back and drew that on the wall; they made some markings on the wall. That was data that was collected at the time. Fast forward to the Gutenberg printing press, and we had a better way of collecting data: writing. Fast forward to electrical devices, radios and television, and we had even better ways of collecting data, which is a representation of the world. Fast forward to the invention of the PC, the personal computer, and we have better ways still. But ultimately, this is a representation of the world, whether we’re talking about someone going out hunting or someone going to a sporting event; we want to tell that story and collect data to tell the story. Humans have been doing this for a long time. The challenge that we’ve had in the past, where there was an asymptote, was the ability to collect data. Humans have always had this desire to collect as much data as possible to tell the story, but the limiting factor, the asymptote, has always been the technology at hand. In the past, it was cave paintings. When writing and printing came, we could write books, but you were limited in the number of books you could write. Now, as we’ve gone into the age of digitization, we have even more capacity to collect even more data.
And so we’re seeing this massive tsunami, as I call it, of data collection and data utilization, because technology is allowing us to do it. For the very first time, we’ve gotten to an age where both the technology and the capacity allow organizations to collect data at scale. Now, big data is a piece of that. What is big data? When we talk about big data, and I’m sure a lot of your audience might have heard about it, the industry usually defines it using three V’s: volume, velocity, and variety. Volume means collecting data in larger amounts: we’re talking about not just terabytes, but petabytes, or even more, of data. And this is huge. The velocity is how quickly we can collect this data, the speed of collecting it. In the past, maybe we could collect data every day; now we can collect data every second, or down to the millisecond. The velocity has increased. The last V is the variety, meaning the types of data we can collect have increased. Instead of just collecting writing, we can collect audio; this podcast, the conversation we’re having here, is collected. We can collect PDFs, we can collect images, we can collect X-ray scans, we can collect MRI images. You can put a telescope up in the sky, look at the solar system, and collect data from that. So different varieties of data are being collected. And all of that, the volume, the velocity, and the variety, is really what is causing this push of big data that we’re seeing. Here’s an interesting statistic for you, and I think this will be fascinating to your audience: of the data in existence today, in the last two years we’ve created more data than was created from the dawn of humanity prior. So it’s exponential growth, not even linear. And it’s really mind-blowing if you think about it.
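That exponential claim can be sanity-checked with a toy model. The sketch below assumes, purely for illustration, that the amount of data created each year doubles; under that assumption (the function and growth rate are invented for this example, not a real measurement), the last two years always outweigh everything that came before:

```python
# Toy model: if the data created per year doubles every year,
# the last two years exceed all prior years combined.
def data_created_per_year(years, start=1.0, growth=2.0):
    """Data created in each year, in arbitrary units (assumed doubling)."""
    return [start * growth**i for i in range(years)]

yearly = data_created_per_year(10)
last_two = sum(yearly[-2:])     # most recent two years
all_prior = sum(yearly[:-2])    # everything before that
print(last_two > all_prior)     # True for any doubling series
```

The doubling rate is a stand-in; the real growth rate of global data is an empirical question, but any growth factor of two or more makes the "more data in the last two years than in all of history" statement hold automatically.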

R Blank 8:52
Wow. And that’s going to continue, right? So in two more years, we could say the same thing.

Fru N. 8:59
Exactly. So yeah, in two more years, we’ll be saying the exact same thing. And it’s interesting to think about it not as linear growth, but exponential. So in two years, we create more data than we ever did from the dawn of time. And it’s partly because the technology is allowing us to do so.

R Blank 9:18
Thank you, that was a really helpful visualization; I appreciate that. So, I don’t know if it’s fair to say that you share many of my concerns about big data, but I think you understand them. And you’re also a big proponent of the value of big data. Can you speak a little bit to that value? What value do you see in big data? And in particular, why should consumers be happy that big data exists?

Fru N. 9:51
Yeah, that’s a really good question. And it really comes down to a balancing act that I see in the industry. Being a practitioner in the space, as you said at the beginning, I get to see a lot. And there is a lot of debate around data and privacy, and rightly so; I think it’s time we have this conversation, and I think we’re going to get into that later on here. But the balancing act is understanding that there is also value in data, which is what makes companies so attracted to it. Just to give you an idea, from the statistics, we’re hearing of 10 to 15 trillion dollars in economic value to be gotten from data by 2030. This is what some of the research firms are talking about. So when you hear about that 15 trillion of economic value, that’s value, right? And companies want to be a part of that. That’s why we’re seeing this gold rush, where executives are looking at it and saying, well, how is my company positioned to be a part of that? How can my company deliver and gain part of that value to be realized by 2030? So this is really a push about economics; it’s a push about survival for a lot of companies. They want to be relevant, and that requires taking advantage of data. Now, to go one level deeper into that, across the different industries: whether we’re talking about healthcare, education, or politics, data is providing value across the board. Just take medicine, for example, which is a big field that I play a role in. Personalized medicine is a huge thing in the industry today. If someone goes out to get, say, a cancer treatment, they’re now looking at it and saying, can we get more data about the cancer that’s affecting people?
So they can provide more tailored, specific medications for a specific cancer based on the specific individual’s profile. It takes data to be able to personalize medicine to that level. And what that leads to is better outcomes, better health. I think a lot of people would appreciate that: if we can use data to help people live better lives and be healthier, people would find that good. Another area is education, using data for more tailored curriculums. Not every student is going to learn at the same pace or understand every subject equally well. So can big data be used to understand individual students and the areas in which they struggle, to come up with personalized education for each student? Instead of this massive fire-and-forget kind of approach, where a teacher just stands in front of the classroom and regurgitates the same subjects to every student, could a student who is struggling in algebra, or linear algebra, or calculus, get more emphasis in that area compared to somebody who’s not struggling there? It takes data to realize that. Go to politics, go to medicine, go to sports; almost every industry, and I can go on and on, banking too, is seeing value from data. And that’s why we’re seeing this push of companies saying, okay, we want to be a part of this gold rush and collect as much data as we can, and use it. Now, there are going to be downsides, and I can talk about that too, if you…

R Blank 13:44
Yeah, no, no. And I think we will definitely get to that. But I thought that was really helpful, because I think when a lot of people read these stories in the news, like the Facebook whistleblower, they hear about big data and just assume big data is about how Mark Zuckerberg figures out how to give you an ad from Jeff Bezos to buy, you know, whatever product you want. But you’re showing that there’s actual, tangible value for consumers, in terms of better health care and better education. Those are very high-value subjects. So I appreciate that explanation. Before we get into some of the controls and the cultural issues, there’s just one more kind of buzzword or keyword I wanted to throw into the discussion, and that’s based on this big data: we’re hearing a lot more about the advance of things called machine learning and AI, or artificial intelligence. And I know you have expertise in this. So I’m hoping you could take a moment to explain, in very basic terms, what these technologies, machine learning and AI, are, and how their increasing prominence impacts data control and privacy.

Fru N. 15:03
Yeah, absolutely. That’s a very good question. Machine learning and AI are a big part of the industry, which is growing at an exponential rate with all the data available. What machine learning does is help, using algorithms and specialized statistical models, to gain insights from data. So we’re talking about not just looking at reports, reporting numbers and statistics about what happened, but getting into more predictive and prescriptive types of insights. Let me use an actual example here. Let’s say there is a sporting event happening in a town, and people go to that sporting event. You can use machine learning to run past statistics and say, okay, we had 5,000 people attending that event; but can we take this one step further and predict, for the next sporting event, how many people will come? This is where we’re seeing a lot of advancements in machine learning, in AI, in neural networks and deep neural networks: coming up with very specialized algorithms to help with problems like that. You know, one of the biggest challenges in the industry was, or is, vision. Being able to drive cars: can you put a car on the road and have that car drive itself, by looking at its surroundings, gathering data from its surroundings using cameras, or lidars, or different technologies, and navigating itself without hitting pedestrians, without just making a mess? There have been a lot of advancements in AI from that perspective. And we’re seeing that in the real world with Tesla, and with a lot of different startups that are coming up, promising the idea that we’re going to have self-driving cars on the road. And that’s all a result of AI. Now, AI is a very interesting area to talk about.
And I love this area, just because I think it’s the next frontier, because it brings together three key pillars. Number one is the data itself: you cannot really have AI without data, so companies have to collect this massive data. The second piece is having the algorithms that analyze the data, the algorithms that are making decisions, right? And we’re seeing a lot of advancements there, with neural networks and different algorithms in that space, XGBoost; people who know that area will understand that. And the last piece of it is the hardware. With hardware, you hear of advances in GPUs and CPUs, really fast hardware, so you can have a watch that can look at your face and answer phone calls. We have hardware that can be miniaturized to the level that allows these machine learning models to work. Now, when I talk about machine learning, the biggest area of concern, and I think your audience will really appreciate this, is: are we getting to the point of letting machines make decisions? Because if you think about it, making decisions is a very sacred thing for a human, right? I always imagine that if I have to decide to do something, making that decision is my choice. Now, if we are getting to the point where machines get to make decisions for us using algorithms, what does that mean for the sacredness of making decisions? And I think this is where the debate is coming up a lot. How much authority, what latitude, should machines have to make decisions for us? Be it the decision of what show I should watch on Netflix, or a recommendation engine: if you go on Amazon and look at the recommendation engine there, if you bought this, then you might buy that, right? They recommend that to you.
Or if you’re on YouTube watching this podcast, on the side they’re recommending other podcasts you should watch. If you’re on Facebook looking at a friend, they’re recommending other friends you might like. Machines are now getting to make those decisions and put out that information for us. And this is where there’s friction, because people find that, I would say, creepy to a certain extent. And that’s where we’re getting a lot of the debate.
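The "if you bought this, you might buy that" logic Fru describes can be sketched, in its simplest form, as item co-occurrence counting. This is a minimal illustration with made-up purchase baskets, not any company's actual algorithm:

```python
from collections import Counter

# Toy co-occurrence recommender: "people who bought X also bought Y".
# The purchase histories below are invented illustration data.
purchases = [
    {"camera", "tripod", "sd_card"},
    {"camera", "sd_card"},
    {"camera", "sd_card"},
    {"camera", "tripod"},
    {"laptop", "mouse"},
]

def recommend(item, baskets, top_n=2):
    """Rank other items by how often they appear alongside `item`."""
    co = Counter()
    for basket in baskets:
        if item in basket:
            co.update(basket - {item})
    return [other for other, _ in co.most_common(top_n)]

print(recommend("camera", purchases))  # ['sd_card', 'tripod']
```

Real recommendation engines layer far more on top (collaborative filtering, learned embeddings, ranking models), but co-occurrence counting is the basic intuition behind "customers who bought X also bought Y."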

R Blank 19:48
Yeah, and you know, the machines might be making those recommendations, but the humans decided what recommendations the machine should be making. So

Fru N. 19:56
Exactly, yeah. People don’t really talk about these algorithms. Yeah.

R Blank 20:00
No, no. So when you talk about that tension, bring us in, because you’ve been in the room when these types of discussions happen. And obviously, we’re not talking about any specific client or company in particular. But in general, what do those discussions sound like? How do these decisions get made? Like, how invasive should we be? What should we be recommending? How expansive should we be?

Fru N. 20:28
Yeah, I think that’s a really good question. And I’m going to try to keep it at a very high level, just to give a sense of it. If you think about companies, there’s always the idea of the bottom line, and even the top line, that you need to meet. A good example is the media advertisement space: campaigns and their effectiveness. If you go out and you’re shopping for a car, the moment you go to Google and type, you know, "Toyota," then all of a sudden you go to Facebook and you see those cars around; you go to LinkedIn, you see the cars around; this ad starts following you around. Now the question is, the companies making that decision, what is their motivation? Ultimately, they have a very simple motivation, right? They want to sell cars; they want to meet their revenue targets. But should that be the only motivation companies have? This is why you’re seeing the push of ESG, environmental, social, and governance, really coming into play to say: sure, you can make decisions that would meet your bottom line, that’s one way of looking at it. But you also have to be ethical; you also have to take into consideration the people you’re making the decisions for. So if a company can meet its bottom line by buying ads from Facebook to target you, and follow you around as you go from one device to another, just to pressure you into buying a car, at what point do they have to stop pushing you, or give you the option to opt out of that campaign if you don’t want to be a part of it? I think this is where we’re seeing the push and pull. And I have talked about this a lot: there are three key pillars happening here.
The first one is the technology; the technology allows us to do it. Technology can do a lot of things, right? There are technologies out there that can literally watch you in your room and know what you’re doing, listening to what you’re saying in the room. Don’t even get me started with that, because there are devices and gadgets out there that are very capable of doing that. And it is creepy. And that’s why having this conversation is very worthwhile, because people then get to be aware of it, and they can make very informed decisions about how they use these technologies, or don’t use them. So number one, the technology is very capable of doing things. The second aspect of it is politics. If technology can do A, that doesn’t mean we should do A, and that’s where politics comes in to say, okay, let’s put laws in place; let’s put the GDPR, or the CCPA, and all the regulations in there to limit technology from doing that. An example outside of technology that has happened: think about mining, think about the oil pipelines, right? And I know this could be a sensitive area for some of the audience. But there are companies out there that could literally mine anywhere, take resources from any part of the country, and pollute every single river and stream there is, to meet their bottom line and make profits. But we know there are laws and regulations in place to limit companies from doing that. If you’re going to mine and pollute this river and stream for everybody else, then you don’t get to mine, right? Even if it affects your bottom line; laws have been passed to stop that. And what we’re seeing is those laws coming into the digital age, to protect citizens, to say: you might want to meet your bottom line, but if you’re going to do that at the expense of citizens and users, then we’re going to have to put laws in place.
And this is where the GDPR came into place, and the CCPA in California. And a lot of states, Nevada, Maine, are poised to pass their own regulatory laws to say: this is the kind of data you can collect, this is how you can use that data, this is how you should not be able to use that data, and you have to give people the option to opt out, or to have their data deleted if they want. Laws are being passed now.

R Blank 24:43
Sorry, could you explain, just for people who haven’t heard of these terms, what GDPR is, and the CCPA in California, and so forth? What do those stand for, and what are those laws?

Fru N. 24:58
Yeah, very good question. So GDPR stands for the General Data Protection Regulation. It’s a regulation passed by the EU to limit how companies can collect and use data within the EU ecosystem. It applies to EU citizens: if you’re a citizen of the EU bloc, GDPR applies to you. And there are seven pillars to GDPR. What it means is that the data being collected has to be limited, so you have to collect data just for what you’re using it for. It has to be accurate. One of the key pillars in there that I really enjoy personally is the right to be forgotten. So if you call up Facebook, or any company that is subject to GDPR, and you say, hey, I want you to remove every piece of data you have about me in your systems, in your databases, companies are obligated by law to comply with that; otherwise they get fined. So it’s a combination of limiting the amount of data that is collected and providing transparency in the way data is collected. I’m sure if you go to websites today, many of them will have something about cookies: do you accept, right? A lot of that is in response to the GDPR law. You have to opt in to say, I want you to collect my cookies; and if you don’t want to, you can deny that. So that’s GDPR, from the EU. In response to that, in the US, California did pass the CCPA, which protects consumers in California. It stands for the California Consumer Privacy Act; that’s what CCPA stands for. It provides a lot of what GDPR does, with a few things that are limited to California residents. And other states are very much in line to pass their own laws. So if you live in a state other than California, I would encourage you to check in with your state laws and see what the legislature is doing. I do know a lot of states are very close to passing theirs, or are adopting something very similar to what the CCPA is.
But there is a groundswell around these regulations, and a lot of discussion happening here in the US and even outside: in Brazil, in Chile, in South Africa, in China, Japan. Almost every country is responding to these privacy concerns by asking their legislatures to pass laws to protect consumers.
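The cookie banners Fru mentions implement an opt-in default: no tracking identifier is attached unless the user explicitly consents. A minimal sketch of that idea, with hypothetical field names and cookie values:

```python
# GDPR-style opt-in gate: tracking cookies are only set after explicit
# consent. Field names ("analytics_id", etc.) are invented for this sketch.
def build_response(user_consented: bool) -> dict:
    response = {"body": "page content", "cookies": {}}
    if user_consented:
        # Only attach an analytics identifier once the user has opted in.
        response["cookies"]["analytics_id"] = "abc123"
    return response

print(build_response(False)["cookies"])  # {} -- no consent, no tracking
```

The important design point is the default: absent an affirmative "yes," the response carries no tracking state at all, rather than tracking until the user objects.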

R Blank 27:47
Thank you very much; that was a great explanation. There are a few points you brought up that I’d like to get to, but one is just: GDPR, or any of the similar pieces of legislation you mentioned, do they talk about internal controls within a company? The characteristics you described outlined, sort of, controls that consumers have over the ways that companies handle their information. And this might be a silly example, but I’m thinking, just in the past couple of weeks, I think it was Amazon that fired an employee who had been checking on the orders of, I believe it was an ex-girlfriend, or something. And you hear about stories like this every so often. So I’m wondering, do these regulations protect consumers against internal abuse? Do they provide regulations for internal controls at these companies?

Fru N. 28:49
Yeah, absolutely. That’s a really good question. Because on the surface, the rights, and it’s almost like a bill of rights, in the GDPR are simple. It stipulates what is required: the right to be informed, the right of access, rectification, erasure, and the other rights. That’s what it says. Now, how that gets implemented behind the scenes, it’s not very prescriptive about. But to comply with it, companies have to rethink their internal processes, very similar to SOX here in the US, Sarbanes-Oxley, which was passed back in the 2000s from a financial perspective. What that means is that companies, in trying to meet the GDPR requirements, are building internal infrastructure, internal systems with governance, to comply. One area that I see a lot, let’s take lineage as an example. Lineage means thinking about how data is collected: What sources is the data collected from? Where is that data stored internally within the systems? Who has access to those systems? Who queried them, and this is where the example you brought up is a really good one: who has queried that system at any point in time? What records did they see? Why did they touch them? Are they authorized and authenticated to touch the system, and for what reason? So if an audit comes into an organization, they have to prove that they are GDPR compliant, or HIPAA compliant; HIPAA is another big one, especially for the medical space. And part of that compliance is being able to prove those things. In the case you brought up, for companies that don’t do that, where you have a rogue employee who goes and does something unacceptable, there are fines, right? So it’s not just a goodwill effort; it’s not just, okay, we’re going to trust you to secure this data.
There are audits that do come in to ask: how are you securing this data? If somebody queries this data from your database to look up, you know, the orders of a girlfriend. Or there was something, I believe it was in Florida, of some cop, it seems to happen a lot with romantic relationships, looking up the records of some lady who was arrested, to use for some romantic interest, right? But why did they do it? Because the system allowed them to do it. So it’s not just a goodwill thing; you have to use technologies, systems, and tools that will prevent somebody from looking at data even if they want to. And there is a lot of software in the industry popping up around that, from a governance perspective: tagging, lineage, and all of that, to solve that problem.
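The audit trail Fru describes, who queried what, whether they were authorized, and for what reason, can be sketched as an access check that records every attempt, allowed or denied. The user names, purposes, and allow-list here are invented for illustration:

```python
from datetime import datetime, timezone

# Every query is checked against an allow-list (user -> permitted purposes)
# and recorded, so an auditor can later answer "who saw what, and why?"
AUTHORIZED = {"alice": {"billing"}, "bob": {"support"}}
audit_log = []

def query_record(user, purpose, record_id):
    allowed = purpose in AUTHORIZED.get(user, set())
    audit_log.append({
        "user": user,
        "purpose": purpose,
        "record": record_id,
        "allowed": allowed,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    if not allowed:
        raise PermissionError(f"{user} is not authorized for {purpose}")
    return {"id": record_id}

query_record("alice", "billing", 42)      # permitted, and logged
try:
    query_record("bob", "billing", 42)    # denied, and still logged
except PermissionError:
    pass
print(len(audit_log))  # 2 entries: one allowed, one denied
```

Note that denied attempts are logged before the exception is raised; an audit trail that only records successful access would miss exactly the rogue-employee behavior discussed above.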

R Blank 32:04
So you mentioned the right to be forgotten. And when I first contacted you about this interview, I asked you specifically about control over posthumous data: what happens to your data after we die? And this follows from the point about the right to be forgotten, because a lot of people, I think, don’t even think about what happens to their data after they die. In your experience working with companies and building enterprise-scale big data applications, well, first off, is this issue addressed? Is there contingency planning for consumers, or, obviously, for relatives of consumers, to be able to manage data about someone after that person passes? Is that addressed, and if so, how?

Fru N. 33:01
Very good question. So I would say it's something we're seeing a groundswell around, especially with the wake-up call companies are getting as a result of GDPR and CCPA. They are responding to this. And it's very interesting that the right to be forgotten is one of the pillars in GDPR. What I see on the enterprise side is that companies are responding, and the systems being built are built to accommodate things like that, so the technology allows for it. The term I use is tagging. So even as a data set is coming in, you can apply the appropriate tags to say: this is sensitive data, or this is PII information, or this is health information. You can tag the data appropriately, and you know which user that data applies to. So if I call up, say, Facebook, and I want them to get rid of all my data, step number one is knowing where they have my data stored. And this is a wicked problem, and I don't want to make it seem trivial, because it's not just: go to this one database, you'll see all the customer information, delete it, and you're good. It's not that trivial. Especially for the larger organizations I work with: they might have a CRM system (CRM is basically customer relationship management), they might have an ERP system (enterprise resource planning), they might have a marketing system for running marketing campaigns, they might have a sales system. They could have all these different systems, with pieces of your data scattered all over the organization. So when you make that request to an organization, saying I want all my data deleted from your enterprise, the organization first needs to know where that data is stored. And this is why having a solid governance infrastructure from the beginning is key.
And with the right tags and the right policies in place, companies can say: all right, your data is in this system, this system, and this system, and we can issue a command to go delete it from those systems. The organizations at the forefront are implementing that. And I can tell you, a lot of organizations are hiring people to come in and ensure they have the capacity to do that. Of course, not everyone is at the same maturity level. But if you just do a Google search on roles like chief data officer, chief information officer, or chief privacy officer, those roles are increasing in popularity. Why? Because companies need people to come in and make sure those privacy measures are in place: that they understand where the customer's data is stored, and that they have the technology to delete it from those systems if that request comes in. Because if you don't, there's going to be a huge, huge fine. So a lot of companies are really responding to that, from what I see.
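The two-step workflow Fru describes, first locating every system that holds a user's tagged data and then issuing the delete against each one, can be made concrete with a small sketch. The system names and record fields here are invented for illustration; a real enterprise would have many more stores and a proper governance catalog:

```python
# Toy "enterprise": three systems, each holding tagged records.
# Tags like "pii" are applied as data arrives, per the tagging idea above.
systems = {
    "crm":       [{"user": "u42", "tags": {"pii"}, "email": "u42@example.com"}],
    "marketing": [{"user": "u42", "tags": {"pii"}, "campaign": "spring"}],
    "sales":     [{"user": "u7",  "tags": {"pii"}, "order": "A-1"}],
}

def locate_user_data(user_id):
    """Governance step one: know WHERE the user's data lives."""
    return [name for name, rows in systems.items()
            if any(r["user"] == user_id for r in rows)]

def erase_user(user_id):
    """Step two: issue the delete against every system holding the user."""
    touched = locate_user_data(user_id)
    for name in touched:
        systems[name] = [r for r in systems[name] if r["user"] != user_id]
    return touched

print(locate_user_data("u42"))   # ['crm', 'marketing']
print(erase_user("u42"))         # ['crm', 'marketing']
print(locate_user_data("u42"))   # [] -- nothing left to find
```

The hard part in practice is exactly what the interview says: step one. If the organization can't enumerate where a user's data lives, no delete command can be complete.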

R Blank 36:15
So when you were talking about GDPR, you mentioned that there's an equivalent piece of legislation in California, and we're starting to see it in some other states. But obviously GDPR is a European initiative, and I feel like it highlights some of the cultural differences between the EU and the United States. We're actually seeing that in a few episodes this season of the podcast. And I know you're a big believer that a society's relationship with big data is determined by culture. So I guess what I want to ask is: what does culture mean for big data? And conversely, what does big data mean for culture? How do these forces impact each other?

Fru N. 37:05
Yeah, that's a really good question. Because we always think about technology as being on the frontier, driving innovation, pushing the boundaries, going out there. But we live in a culture, especially in the US, that is, from a political perspective (and I'm not that savvy at all when it comes to politics), for the most part pretty centered: not too far to the left, not too far to the right. People don't want to rock things up too much, and people want to be conservative in terms of their privacy and their data. That's why, even going to the legal side, you need a warrant to come into somebody's house and search. You can't just jump into somebody's house, knock down the door, and search for whatever you want. So there has always been, really from the inception of this country, the idea that you should have a private space, that privacy is a first-class citizen here, and that it should be respected. If you're going to intrude into my private space, you should have very good reasons to do so, and even if it's the government, the government should have really good reasons to do so. That hasn't always been the case; we know of different scenarios with the NSA and some of the things they've done that might have gone past that. But culturally, people still have that expectation of privacy. And so we're seeing this push of the innovators coming up with these technologies, running, you know, a hundred miles per hour at the front. But the culture is saying: we've got to slow down here, we've got to think about what we're doing. If we're going to use an algorithm, is this algorithm ethical? If we're going to collect data, are we collecting the right data?
So there's really this balance between the culture and the technology. I describe it as three pillars at play: there is the technology, there is the politics, and then there is culture, and each of those is influencing the others. Now, if technology goes too far, what happens is the culture, the people, say: we don't want this, we want the laws, we want the privacy. And they push politics to pass laws that limit technology. And then you can also have technology changing and influencing culture to go along with it. So it's a balancing act: culture and people getting used to their data being collected, and if they don't want it, they respond by asking legislators to pass laws to limit it. And it's always a moving target. I wouldn't say culture is static; we can all agree that culture isn't static. You know, when I was young, and I'm not going to date myself here, people could not imagine this. Imagine having a device in your room that has a camera, and that device is connected to the internet and can listen to what you're saying. People couldn't even imagine that ten years ago, right? Now there are people who find that okay, and there are a lot of people who don't find it okay, and I'm sure your audience would fall into some of those categories. So there is this culture shift, where some people find it okay, but then the vast majority of people who don't find it okay want politics to respond and limit it. That rebalancing is what I tend to see in the industry.

R Blank 40:48
So as we're getting closer to the end here, I'm going to ask you to take off your professor hat and put on your advocacy hat. With everything you just said in mind, what changes would you like to see in how big data applications are envisioned, designed, deployed, and managed?

Fru N. 41:15
Very good question. And I'll answer that with a story; for us, it was a big, big deal here in the Midwest. There was this large multinational company in the retail space, and there was a father who had a daughter, and apparently this daughter was pregnant, and the father didn't know it. And the father kept getting mailers delivered to the house talking about pregnancy and baby products. And the father was just shocked: why is this company sending these coupons to our house, targeting my daughter? So he stormed into the store location, asking to talk to somebody: my daughter is still a teenager, I can't understand why you guys keep sending us these things promoting pregnancy, are you trying to get her pregnant? And long story short, and you can read about this in the news, the father eventually found out that the daughter was actually pregnant. This is a true story. But that organization had somehow, using data, known that she was pregnant. And how did they know? It was based on her buying pattern. I don't know the exact items she bought that led to them figuring this out, but if you buy a certain combination of things, say lotion and tissue and supplements, well, people who tend to buy that combination of things tend to be pregnant. And they can use that at scale, looking at the whole population, to figure out that some lady is pregnant before her own family knows. This is what scares a lot of people when we share stories like this. Because somebody somewhere can look at it and say, yeah, it helps us sell more tampons, it helps us sell more stuff in the store. But is that really what you want?
Did you get the consent of the customer that that's what you should be doing? What does that mean for the relationship between the dad and the daughter, for them to find this out that way? So these are the ethical questions that companies have to grapple with, even if they're able to meet their bottom line. Technology gives us the capability to do a lot of things, a lot of things. There are a lot of questions you can ask if you have the data on hand. The question is: should you ask that question? That's really what companies have to grapple with.
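The buying-pattern inference in that story can be illustrated with a deliberately simple model: score a shopper's basket against items that, in this toy, correlate with a target segment, and flag the shopper for a campaign if the score crosses a threshold. The items, weights, and threshold below are all invented for illustration; the actual retail model was far more sophisticated and statistical:

```python
# Hypothetical item -> weight table; a real system would learn these
# weights from historical purchase data rather than hard-code them.
SEGMENT_WEIGHTS = {
    "unscented lotion": 0.4,
    "vitamin supplements": 0.3,
    "large tote bag": 0.2,
    "cotton balls": 0.1,
}

def segment_score(basket):
    """Sum the weights of matching items; higher = stronger signal."""
    return sum(SEGMENT_WEIGHTS.get(item, 0.0) for item in basket)

def flag_for_campaign(basket, threshold=0.6):
    """Flag a shopper for targeted mailers once the signal is strong enough."""
    return segment_score(basket) >= threshold

print(flag_for_campaign(["unscented lotion", "vitamin supplements"]))  # True  (0.7)
print(flag_for_campaign(["bread", "milk"]))                            # False (0.0)
```

Even this crude version makes the ethical point of the story concrete: no single purchase is sensitive, but a weighted combination of ordinary purchases can reveal something deeply private.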

R Blank 44:06
That was a really powerful anecdote, and I think a really great answer to my question. It makes me wonder, just out of curiosity: do you have an Alexa or a Google Home anywhere where you live?

Fru N. 44:22
That's actually a really good question. My girlfriend did get me one. I can tell you I haven't used it. I consider myself a digital minimalist; I'm a technologist, but I'm a minimalist too. So Alexa and some of those... my phone is more than enough for me to handle. I don't know that I'd get a lot of value from some of those devices.

R Blank 44:53
Before we sign off, I wanted to ask: I see you're a volunteer for an organization called Books For Africa, and I was wondering what that organization does. And I guess one reason I want to ask is, given your IT-heavy background, why have you chosen to channel your volunteer efforts into such a low-tech initiative?

Fru N. 45:15
That's really a good question, and thank you for asking. I'm a huge believer in literacy; I just think literacy is the biggest liberating factor you can offer any individual. Originally, I came from West Africa. I grew up in a very, very tiny village without a lot of opportunity. I share the story that today I work in high tech, I speak with companies across the world, I travel to different continents to talk about technology and the problems companies have. But my background really was very low-tech. I learned computers in an environment where we literally didn't have computers; the teacher had to draw on the board to say, this is what a computer looks like, this is what a mouse is. That's how I got my introduction to computers. So I've always been fascinated by the power of computers for good, alongside, obviously, the challenges that come with making sure we're secure and governed. Even as a little kid, I was fascinated by that. Now, Books For Africa, which I volunteer for, support, and care a lot about, is helping bring knowledge and literacy to the African continent. It collects a lot of books and ships containers of books to different areas. Books have been a time-tested medium for knowledge sharing, so that's a really good avenue. Now, why books? I'll give a story. I visited my village about three years ago. You go into some areas and there is no Wi-Fi. I know some of the viewers might be surprised that you could go to an area with no Wi-Fi in 2021, but there are areas with no Wi-Fi. So if I made a podcast or a video and wanted somebody there to watch it, they just cannot watch it, because they don't have that connectivity. But you can get a book to them. There is nothing stopping you from getting a book to them.
So that's why I get really passionate about books. I'm a big book reader myself; I've written books myself. Being able to take that knowledge and democratize access to it at scale is something I'm extremely passionate about. As technology comes in and the medium of consumption grows, it might go beyond books, but books have been around for centuries. It's a very time-tested medium, and that's what I care a lot about.

R Blank 47:56
Well, Fru, I really want to thank you for taking the time today. This has been a really insightful explanation, because these issues are core to modern society, but it's hard to get a real explanation of what they actually are. When you see it covered in the news, it becomes very sensationalized and abstracted. And I really appreciate how you've helped us get into some of the details for a better understanding.

Fru N. 48:26
Yeah, and thanks for that. The last thing I'll add, since we're talking about literacy here, is that we're also seeing a push for, and this is something I'm really passionate about, data literacy. We're seeing a groundswell around that. Because people tend not to fear what they understand; if they don't understand data, if they don't understand why data is collected, it can be very scary. So a lot of companies and organizations are investing in data literacy, to help people understand what data is and why data is being collected. Even down to basic things like the cookie settings on your browser: there is a whole big push happening right now, being led by Apple, to take away cookies from browsers. Companies like Facebook and Google are all trying to respond to that, to come up with new models for still being able to run their business by selling ads. So the technology is changing; there's a lot of groundswell and change coming around something as basic as cookie settings. If I asked an average person: do you know where to go on your browser to control the cookies, if you don't want a company to take your data and follow you around as you go to different sites? Being able to do that, I think, is a step. And this data literacy and privacy literacy would be very relevant in this whole thing, because of privacy and governance and security: for individuals to keep themselves secure, while allowing companies to still deliver the value they want to deliver, but making sure they do so while respecting people's privacy. Privacy should be an opt-in solution, not an opt-out solution; that's where we're seeing the change going. You should have to opt in to get the ads. Right now, we're in a situation where you're opted in by default and you have to physically opt out.
By 2023, I can tell you, with the way Apple is going and the way Chrome is going, it's going to move to opt-in. You're going to have to opt in if you want to be part of it, and you're going to be opted out by default. So that's going to be a huge change that people should be watching for.
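The opt-in versus opt-out distinction Fru draws comes down to one default: what happens when a user has made no choice at all. A minimal sketch, with invented names (this is not any browser's or ad platform's actual API):

```python
class ConsentLedger:
    """Records explicit user choices about tracking consent."""

    def __init__(self):
        self._choices = {}           # user_id -> True (opted in) / False

    def record(self, user_id, opted_in):
        self._choices[user_id] = opted_in

    def may_track(self, user_id):
        # Opt-in model: no recorded choice means NO tracking.
        # An opt-out model would flip this default to True.
        return self._choices.get(user_id, False)


ledger = ConsentLedger()
print(ledger.may_track("u1"))        # False: default is opted out
ledger.record("u1", opted_in=True)
print(ledger.may_track("u1"))        # True: only after an explicit opt-in
```

The entire policy shift lives in that one default value in `may_track`, which is why the change feels small technically but is a huge change for the ad-driven business models built on the old default.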

R Blank 50:57
So where can people connect with you, to chat or to learn more? Is there a URL or social handles you want to share?

Fru N. 51:05
Yeah, thanks, I'll do a little bit of a plug here. I'm available online in many different spaces. I talk a lot about technology. I have a channel on YouTube called Tech with Fru, where I discuss topics like this, related to data, technology, and algorithms. So if anyone wants to check that out, it's Tech with Fru on YouTube. Or you can check out my blog, where I write about emerging technologies, at techwithfru.io. It's an amazing blog; I just put content in there that could be relevant for folks interested in hearing about these topics.

R Blank 51:44
Excellent. Well, again, Fru, thank you so much. This has been a really great interview, and I really appreciate you making the time.

Fru N. 51:52
Thank you guys. It’s a great platform to have, and I appreciate you having me on the show.

R Blank 52:03
Wow, well, that was really insightful. As always, I'm joined here by my co-host, Stephanie. Stephanie, what did you think?

Stephanie Warner 52:15
Wow, that was a really interesting interview. And I will say I feel much more data literate, and comforted by the fact that I now understand more about the scope of how data is collected, and the conversations that companies, as well as governments, are having around the ethical use of data.

R Blank 52:40
Yeah, and he spoke in such accessible language. I feel like it's hard to find discussions that go into this kind of depth and detail but are also still accessible to people. So I really appreciated that.

Stephanie Warner 52:55
Yes, discussions about data as well: interesting and accessible.

R Blank 52:59
Yeah. So, a couple of things I took out of that. You know, I have a background in tech; I mean, not for a few years now, but I spent decades in it. But even so, just getting a better appreciation of the scope of what big data means today, I felt that was very illuminating.

Stephanie Warner 53:25
Oh, yeah. You mean how in the last two years we've collected more data than since the dawn of man? I noticed that as well, and it's definitely an eye-opening statistic, a way to gauge how much data is being collected. Because, yeah, in two years we'll have collected, you know, twice that, but probably more, as the capabilities and the velocity also increase.

R Blank 53:57
And another thing I thought was really useful: when you see this covered in the news, you see things like, it's all about advertising. And obviously advertising drives a huge amount of this business, and online commerce, and IT overall. But I think the examples he gave in things like health care and education really do show that there is value being created for consumers out of big data. Now, that doesn't mean we shouldn't still have these important discussions about where the boundary lines are and what controls should exist. But it's not all just about advertising and Facebook; big data is providing value for consumers in multiple areas of their lives.

Stephanie Warner 54:44
Yeah, I think the health aspect and the education aspect are understated, at least in the United States, and I was really happy he brought those two components in, because they are so key. There are so many ways we can collect data that will help tailor medications for individuals; I believe he said better life and better health, and that's really, really important. But what I also loved is the education factor: the concept of a tailored, individual, personalized approach for each student. I think he gave the example that if a student is struggling in one subject, you give them more time with that subject, rather than an equal amount of time across all subjects, including ones they don't struggle with. I just thought: what a great approach, and a great use of data to help make the world smarter.

R Blank 55:50
And once again, because we're seeing this in a few episodes this season, the discussion today showed how much further ahead the EU is than the United States in terms of confronting and managing issues like this. Today, he brought up how, for instance, California has a law that's very close to the European law, and other states are in various stages of enacting laws like this. So at the state level it's not the same story across all 50 states, but as a country, the US just feels really far behind on a lot of these core issues that are central to modern life.

Stephanie Warner 56:32
Yeah, I agree. And honestly, I don't know what to say other than I agree.

R Blank 56:39
Yeah. Well, just before finishing up, there was one other thing I wanted to highlight, because I'm kind of the same way he is. He's a tech expert who's a tech minimalist. He's a big believer in a lot of this stuff; it's not only his career, he clearly feels for it, he sees the value in it, he sees the importance of it. And yet he still doesn't get every device he can. In fact, when given an Alexa, he doesn't even use it. I just love that; that's so much a part of the healthier tech message. I didn't know that when I booked him for the show, but I really appreciated him pointing it out.

Stephanie Warner 57:24
Yeah, and I think that's a really good approach, the approach that I think you and I both have as well, where, just because we have the option, we discern a little bit: do I actually need Alexa? Probably not.

R Blank 57:41
So why not ask Alexa? Hey Alexa, do I really need you?

Stephanie Warner 57:46
No. I'm Team Siri.

R Blank 57:49
Oh, even better: ask Siri if you need Alexa.

Well, that does it for today's episode. Remember, if you liked this show and want to hear more, please subscribe to The Healthier Tech Podcast, available on all major podcasting platforms. And if you have a moment, please also leave a review; reviews are really critical to helping more people find this podcast and learn about the important and under-covered topics that we discuss. Also, you can learn more and sign up for our mailing list, to get notified when we have new interviews, webinars, ebooks, and sales, at shieldyourbody.com; you can also just click the link in the show notes. While you're there at shieldyourbody.com, you can check out our world-class catalog of laboratory-tested EMF and 5G protection products. Don't forget to use promo code POD to save 15% on your first order, with free shipping throughout North America and Europe. Until next time, I'm R Blank, and I want to thank you so much for tuning in to The Healthier Tech Podcast. Always remember to shield your body.

Transcribed by https://otter.ai
