Significance of data ethics in healthcare


Over the past few years, Facebook has been in several media storms concerning the way user data is processed. The problem is not that Facebook has stored and aggregated huge amounts of data. The problem is how the company has used and, especially, shared the data in its ecosystem — sometimes without formal consent, or with consent buried in long and difficult-to-understand user agreements.

Having secure access to large amounts of data is crucial if we are to leverage the opportunities of new technologies like artificial intelligence and machine learning. This is particularly true in healthcare, where the ability to leverage real-world data from multiple sources — claims, electronic health records and other patient-specific information — can revolutionize decision-making processes across the healthcare ecosystem.

Healthcare organizations are eager to tap into patient healthcare data to get actionable insights that can help track compliance, determine outcomes with greater certainty and personalize patient care. Life sciences companies can use anonymized patient data to improve drug development — real-world evidence is advancing opportunities to improve outcomes and expand on research into new therapies. But with this ability comes an even greater need to ensure that patients’ data is safeguarded.

Trust — a crucial commodity

The data economy of the future is based on one crucial premise: trust. I, as a citizen or consumer, need to trust that you will handle my data safely and protect my privacy. I need to trust that you will not gather more data than I have authorized. And finally, I have to trust that you will use the data only for the agreed-upon purposes. If you consciously or even inadvertently break our mutual understanding, you will lose my loyalty and perhaps even the most valuable commodity — access to all my personal data.

Unfortunately, the Facebook case is not unique. Breaches of the European Union’s General Data Protection Regulation (GDPR) leading to huge fines are reported almost daily. What’s more, the continual breaches and noncompliance are affecting the credibility of and trust in software vendors. It’s not surprising that citizens don’t trust companies and public institutions to handle their personal data properly.

The challenge is to embrace new technology while at the same time acting as a digitally responsible society. Evangelizing new technology and preaching only its positive elements is not the way forward. As a society we must make sure that privacy, security, and ethical and moral considerations go hand in hand with technology adoption. This social maturity curve might not follow Moore’s law about the extremely rapid growth of computing power, which means that — regardless of whether society has adapted — digital advancement will prevail. But we can’t simply have conversations that preach the value of new technology without addressing how it will impact us as a community or as citizens.

Trust is a crucial commodity, and ensuring that trust means demonstrating an ethical approach to the collection, storage and handling of data. If users don’t trust that their data will be processed in keeping with current privacy legislation, the opportunities to leverage large amounts of data to advance important goals — such as real-world data to improve healthcare outcomes or to advance research in drug development — will not be realized. Consumers will quickly turn their backs on vendors and solutions they do not trust — and for good reason!

Rigorous approach to privacy

Ethics and trust have become new prerequisites for technology providers trying to create a competitive advantage in the digital industry, and only the most ethical companies will succeed. Governments, vendors and others in the data industry must take a rigorous approach to security and privacy to ensure that trust. And healthcare and other organizations looking to work with software vendors and service providers must consider their choices carefully. Key considerations when acquiring digital solutions include:

  • How should I evaluate future vendors when it comes to security and data ethics?
  • How can I use existing data in new contexts, and what will a roadmap toward new data-based solutions look like? How will my legacy applications fit into this new strategy?
  • How will data ethics and security be reflected in my digital products, and how should access to data be managed?
  • How can I ensure I am engaging with a vendor that not only understands its products but can also handle managed security services or other cyber-security and privacy requirements before any breach occurs?

Using technology to create an advantage is no longer about collecting and storing data; it’s about how to handle the data and understand the impact that data solutions will have on our society. In healthcare — where consumers expect their data to be used to help them in their journey to good health and wellness — that’s especially true. Healthcare organizations need to demonstrate that they have consumers’ safety, security and well-being at the heart of everything they do.

Data Centric Architecture


The value proposition of global systems integrators (GSIs) has changed remarkably in the last 10 years. By 2010, the so-called “your mess for less” (YMFL) business model was in its waning days. GSIs would essentially purchase and run a company’s IT shop and deliver value through right-shoring (moving labor to low-cost locations), leveraging supply-chain economies of scale and, to a lesser degree, automation.

This model had been delivering value to the industry since the ‘90s but was nearing its asymptotic conclusion. To continue achieving the cost savings and value improvements that customers were demanding, GSIs had to add to their repertoire. They had to define, understand, engage and deliver in the digital transformation business. Today, I am focusing on the value GSIs offer by concentrating on their clients’ data, rather than being fixated on the boxes or cloud where that data resides.

In the YMFL business, GSIs could zero in on the cheapest disk or cloud that met performance requirements to house sets of applications, logs, analytics and backup data. Each data set was created and used by and for its corresponding purpose, and was often only tenuously tied together by sophisticated middleware and applications serving other purposes, like decision support or analytics.

Getting a centralized view of the customer was difficult, if not impossible: partly because the relevant data was stovepiped in an application-centric architecture, and partly because separate data islands had been created for analytics repositories.

Enter the “data-centric architecture.” Transforming to a data-centric view is a new opportunity for GSIs to remain relevant and add value to customers’ infrastructures. It goes a layer deeper than moving to the cloud or migrating to the latest, faster, smaller boxes.

A great way to help jump-start this transformation is by rolling out Data as a Service offerings. Rather than taking the more traditional Storage as a Service or Backup as a Service approach, Data as a Service anticipates and provides the underlying architecture to support a data-centric strategy.

It is first and foremost a repository for collected and aggregated data that is independent of application sources. From this repository, you can draw correlations, statistics, visualizations and advanced analytical insights that are impossible when dealing with islands of data managed independently.

It is more than the repository of an algorithmically derived data lake. A Data as a Service approach provides cost-effective accessibility, performance, security and resilience – aimed at addressing the largest source of both complexity and cost in the landscape.

Data as a Service helps achieve these goals by minimizing, simplifying and reducing the data and its movement within and outside of the enterprise and cloud environments. This is achieved around four primary use cases, ranging from enterprise storage to backup and long-term retention.

Each of these use cases illustrates the underlying capabilities necessary to cost-effectively support the move to a data-centric architecture. Combined with a “never migrate or refresh again” evergreen approach, GSIs can focus on maximizing value in the stack of offerings. This approach is revolutionary. In the past, the focus was merely on refreshing aging boxes, the specifications of a particular cloud service, or the infrastructure supporting a particular application. Today, GSIs can focus on the treasured asset in their customers’ IT — their data.

Employing automation to test data interfaces

Say you have a database containing billions of records. You need to present this data in the UI, but only after making sure that everything you want to display is accurate and as expected. Incorrect data could impact your business in serious and unpredictable ways that can lie undetected for months. So you need to plan a strategy that answers these questions, and one of the finest approaches is to make sure that everything you show has been validated and verified. This leads to a special type of testing called Data Interface Testing.

What is Data Interface Testing?


Before we go ahead with Data Interface Testing, let’s first discuss the data interface itself. Many applications in the market today are based on data mining or big data concepts, which help streamline big data and present it adequately in the UI.

As many people say, every process has its pros and cons, and this one is no exception. One of the biggest challenges is presenting huge amounts of data. But there is a solution.

The challenge is to show huge amounts of data in the UI with everything in its proper place, with the correct data set and the correct orientation (if you are showing the data in a graphical representation).

 

This interaction between the database and the user interface is what brings us the term “data interface.” Making sure that it works well both ways, i.e. for both requests and response results, is what we call Data Interface Testing.

The big question!

Can this much data really be tested manually?

The answer is yes, it is possible. But practically speaking, it is not a good way to do it.

So what, then? Automation?

Yes.

But what if I don’t have much knowledge of automation?

Don’t worry, we have a shortcut for you: a tool that tests the data interface automatically, requiring only very basic automation/coding knowledge.

Automation Tool

Some info regarding this tool
This tool validates and verifies data between the database and the user interface. You can easily use it in your own working environment by customizing the details in the provided property files and adapting the code to your requirements.

How does this tool work?

Through its main class, the tool reads multiple property files and executes the methods driven by the values in those files.

For reference, the source code is mentioned below:

Main class of the tool:
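A minimal sketch of what such a main class could look like, assuming Java with JDBC and a PropertyReader helper (see point 1 below); the class, method and file names here are illustrative assumptions, not necessarily the tool’s actual ones:

    import java.io.FileWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Properties;

    public class DataInterfaceTest {

        public static void main(String[] args) throws Exception {
            // Load the property files described in the sections below.
            Properties db       = PropertyReader.load("db_property.properties");
            Properties columns  = PropertyReader.load("query_column.properties");
            Properties actual   = PropertyReader.load("query_actual.properties");
            Properties expected = PropertyReader.load("query_expected.properties");

            // Register the JDBC driver and connect using the db_property values.
            Class.forName(db.getProperty("ClassName"));
            try (Connection conn = DriverManager.getConnection(
                         db.getProperty("Url"),
                         db.getProperty("UserName"),
                         db.getProperty("Password"));
                 // A fresh result file is created on every run.
                 FileWriter result = new FileWriter(
                         "result_" + System.currentTimeMillis() + ".csv")) {

                result.write("query,expected,actual,status\n");

                // Each query name must be identical across the three query files.
                for (String name : expected.stringPropertyNames()) {
                    String column        = columns.getProperty(name);
                    String expectedValue = fetch(conn, expected.getProperty(name), column);
                    String actualValue   = fetch(conn, actual.getProperty(name), column);
                    String status        = expectedValue.equals(actualValue) ? "PASS" : "FAIL";
                    result.write(String.join(",", name, expectedValue, actualValue, status) + "\n");
                }
            }
        }

        // Runs a query and returns the first row's value for the configured column.
        private static String fetch(Connection conn, String sql, String column) throws Exception {
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(sql)) {
                return rs.next() ? String.valueOf(rs.getObject(column)) : "";
            }
        }
    }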

The method written in the above class depends on various files. One of them is the Property_Reader file.

This is a custom-made file, which executes multiple methods and returns results to the main class.

1. Property_Reader_Method
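A plausible shape for this helper, assuming it simply wraps java.util.Properties (the class and method names here are illustrative):

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.Properties;

    // Property_Reader: loads a key=value .properties file from disk.
    public class PropertyReader {

        public static Properties load(String path) throws IOException {
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream(path)) {
                props.load(in); // parses key=value lines, ignoring blanks and # comments
            }
            return props;
        }
    }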

2. db_property

This file contains all the properties needed to set up the connection with the server/DB.

Following are the properties used in this file.

    Url=jdbc:presto://10.0.11.198:8080/test/default
    UserName=root
    Password=12345
    ClassName=com.facebook.presto.jdbc.PrestoDriver
  • You can change the URL and credentials to match your available server(s).
  • The password can also be null, depending on the server details.
  • Currently, we are using Presto as the DB.

3. query_column

This file contains the column name for which data needs to be fetched from the DB. Every query should have a unique query name, and that name must be identical across all the property files for that query.

In the example below, “testA_count” is the name of a query; it is distinct from the other two, but identical across the other property files for queries with the same conditions.

Apart from that, irrespective of the number of columns available in the expected and actual queries, the tool will only bring back data for the “Column A” column in the result set.

The same applies to the others.
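For instance, a query_column file matching this description might look as follows; only “testA_count” and “Column A” come from the description above, the other entries are illustrative:

    # query_column: maps each query name to the single column to compare
    testA_count=Column A
    testB_count=Column B
    testC_count=Column C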

4. query_actual

This property file contains the queries created by developers, or fetched from the server log files that are generated while accessing the application through the UI.

5. query_expected

This property file contains the queries created by testers.
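As an illustration (the table and filter names are hypothetical), the same query name appears in both files, and the two queries should return the same value for the configured column:

    # query_actual: query taken from the developers or the server logs
    testA_count=SELECT count(*) AS "Column A" FROM ui_orders WHERE status = 'shipped'

    # query_expected: query written by the tester against the source data
    testA_count=SELECT count(*) AS "Column A" FROM source_orders WHERE status = 'shipped'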

Running the main class creates a new result file on every execution. This file contains the end result for each executed query: the expected and actual values, with numbers, and a pass/fail status.

For a failed case:

Let’s change the actual query.
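For instance (a hypothetical change), altering the filter in query_actual makes the two result sets diverge:

    testA_count=SELECT count(*) AS "Column A" FROM ui_orders WHERE status = 'delivered'

The result file would then contain a row like the following (the numbers are illustrative):

    testA_count,120,87,FAIL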

Points to be considered:

  • Make sure that the query name is always the same across the expected, actual and column property files.
  • For every query there should be a unique query name.

Benefits of using this tool are:

  • Any structured DB can be used, e.g. Presto, MySQL, MS SQL, etc.
  • Platform independent. Can be run on Windows/Ubuntu/Linux.
  • Can be run on a project written in any language.
  • Doesn’t require any prior coding skills or automation knowledge.
  • One can easily put the expected and actual test case in respective property files and can have the result set, with complete information.
  • Can be easily customized as per available resources/requirements.

 

Biggest cases of data breach in the first quarter of 2019


2019 has been a good year — not for many, but definitely for cyber-criminals. While we are still coping with the news of a data breach that occurred two days ago, we hear of another organization’s infrastructure being breached. Let’s start at ground zero.

1. Google Chromecast hack


On an otherwise normal January day, thousands of people using Google’s Chromecast streaming dongles, Google Home smart speakers and smart TVs with built-in Chromecast technology had their devices hacked. The hackers displayed a pop-up informing users that their devices were exposed to the public internet. The odd thing about this hack, however, was that the attackers pushed people to subscribe to the YouTube sensation PewDiePie.

2. Germany’s biggest cyber-attack


Around the same time, Germany was hit by the biggest cyber-attack in its history. Hackers broke into the Twitter accounts of more than a hundred German politicians and accessed highly sensitive personal information, including email addresses, phone numbers, private chats, photographs of victims’ IDs, bills and credit card information. The attackers leaked the data on a Twitter account called ‘@_0rbit’. The German federal police dove into an investigation, and soon a 20-year-old local student was arrested.

3. Ethereum Classic lost $1.1 million to hackers


While the German police were celebrating their victory, the Ethereum Classic blockchain experienced one of the worst days in its history. The popular cryptocurrency exchange Coinbase detected the incident, in which attackers were able to double-spend coins, resulting in the loss of around $1.1 million in Ethereum Classic digital currency and an immediate fall in the currency’s price. Hours later, Ethereum Classic acknowledged that there had been a successful ‘51% attack’ with multiple block reorganizations. The attackers are still under cover, and Ethereum Classic is still investigating.

4. Australian Parliament Cyber-attack


At the beginning of February, the Australian Parliament faced one of the biggest cyber-attacks in its history, with its servers hacked by what the Parliament referred to as ‘the work of a sophisticated state actor’.

5. Leaked database of Chinese citizens found online


In January 2019, cybersecurity experts discovered a huge unsecured database, 854.8 GB in size, lying openly on the internet. The database was stored on a MongoDB instance and consisted of records of approximately 202 million Chinese citizens, who were apparently job candidates. The database was soon taken down; however, the logs showed a dozen or so IP addresses that had accessed it.

6. Wiping out VFEmail.net 


U.S.-based email service VFEmail.net informed its users that all their data, as well as backups spanning two decades, had been lost. It was discovered that the attacker’s IP address was 94[.]155[.]49[.]9 and the username was “aktv,” apparently registered in Bulgaria.

7. Attackers were selling the information on the dark web


In one shocking instance, it was revealed that attackers were selling information from approximately 747 million accounts on the dark web. These accounts had been stolen from 24 very popular websites. Most of these websites had no idea that they had been compromised; however, a few confirmed that they had suffered a data breach.

8. Indane gas breach


LPG company Indane became the victim of yet another data breach, in which the Aadhaar numbers of approximately 6.7 million customers were leaked.

9. Aadhaar details leaked


MongoDB was once again the talk of the town. A 4.1 GB database known as GNCTD was found on a MongoDB instance. It contained the Aadhaar and voter ID numbers of approximately 458,388 individuals, along with references and email addresses on the “transerve.com” domain for users registered with “super admin” and “senior supervisor” designations.

10. 1 million ASUS systems affected by massive supply chain attack


ASUS, the Taiwan-based company and the world’s fifth-largest PC maker, revealed that approximately 1 million systems were affected by a massive supply-chain attack known as ShadowHammer.

11. Bithumb suffers the loss of $19 million


On March 30, news of a humongous $19 million theft from the South Korean cryptocurrency exchange Bithumb reached the public. Hackers had compromised Bithumb’s hot EOS and XRP wallets and transferred approximately 3 million EOS (~$13 million) and 20 million XRP (~$6 million) to newly created accounts.

12. Georgia Institute of Technology suffers data breach


Georgia Institute of Technology was hit badly by cyber-criminals when a data breach led to the theft of the personal information of around 1.3 million current and former faculty members, students and applicants. According to the university, outside entities gained access to a web application connected to the university’s database.

What is the reason behind the success of these attacks?

The first quarter of the year has seen a number of data breaches targeting big organizations. Attackers are learning, adapting and molding their modus operandi with the changing times. Organizations, on the other hand, are still being old school.

Procrastination:


2019 started with Google Chromecast devices being hacked. This happened because a group of attackers exploited a bug that had been lying dormant for five years like a ticking time bomb. Evidently, Google was aware of this vulnerability but kept ignoring the bug.

Being ignorant of the details:


In most cases, organizations are unaware that they are undergoing a cyber-attack. ASUS is one such victim: the attack was ongoing during the second half of 2018, and the company had no clue.

Lack of proper cyber-security measures:


Often, data travelling in the form of packets is not well encrypted, and so it can easily be stolen by attackers. Indane was victimized because of a vulnerability present in its mobile application.

What should organizations do in order to safeguard themselves?


Organizations can employ preventive cyber-security measures to safeguard data security and to ensure that their networks and infrastructure are free from vulnerabilities and loopholes. Cyber-security companies ensure this with a number of managed security services, such as vulnerability assessment and penetration testing, web application testing, network penetration testing, server security testing, etc. Anteelo is one of the fastest-growing cyber-security start-ups in the country. With its team of expert pen testers, the company has provided managed services to businesses in industries like healthcare, banking and insurance. These services have enabled organizations to conduct business without worrying about various cyber-security issues.
