Citizen interaction through digitalization and analytics

Increased use of information to improve citizen interaction is an area that several of our public-sector customers around the world are working on today, and they are seeing major effects. For example, the Ministry of Social Development in New Zealand has saved roughly SEK 6 billion over a four-year period as a result of increased use of information. In Denmark we see a similar trend among the public institutions we work closely with today. It not only creates more efficient operations; it also creates more dynamic, learning organisations that are proactive towards both the stated and unstated wishes and needs of the population.

"By taking the same approach to data analytics that the corporate sector has been doing for decades, MSD saw a huge opportunity to learn more about who receives benefit and to make better decisions about the support and investment they need.” Paula Bennett, Minister of Social Development, Ministry of Social Development, New Zealand.

So what is it that enables these organisations to reap the financial and organisational benefits that the Ministry of Social Development points to in its statement?

In my experience, it usually comes down to a desire to become a more learning, citizen-centred organisation: one that recognises the need for more fact-based governance grounded in citizens' needs and situations, and where proper handling of information is a cornerstone, not only to create an overview of the information but also to extract new insights and act on them in real time.

It is a journey that must be undertaken to improve citizen service, and the first step is to secure a very strong focus from management on the concrete problems you want to address. You want to be able to view these problems from an information perspective and then work more efficiently in smaller, more targeted groups. With the SAS Citizen Intelligence concept, we try to unite the organisational and infrastructure perspectives and thereby support the different needs and requirements encountered in this kind of work. As the Ministry of Social Development and our Nordic customers show, the potential is enormous, and achievable.

If you would like to know more about the possibilities for optimising citizen interaction, you can read more here. If you would like to know more about the Ministry of Social Development's work in this area, you can read more here.

Previous blog posts on the topic:

Get ready to handle the digital citizens
5 prophecies for the public sector of the future
Five focus areas for better citizen service


Analytics helps keep the borders open securely

This autumn, immigration has been discussed in Finland more than ever before. The change in the security situation in Syria was felt even in Finland, and it demanded a quick and decisive reaction. Rapid changes in the world situation are to be expected in the future as well, and analytics can be very useful in preparing for and reacting to them.

Analytics gives the authorities the possibility to simulate and stress test different kinds of situations. At the same time, it makes it possible to improve and speed up border control and to automate asylum application processing. New technology makes it possible to manage asylum seeking in a way that keeps costs at a reasonable level. This is not only a question of money: it also makes it possible to ensure that well-founded asylum applications are processed more efficiently and that applicants can continue their lives as soon as possible.

Analytics can speed up operations, for example in the processing of asylum applications and the control of border crossings. Risk-based profiling can provide a sustainable and humane solution when balancing costs, the flow of border crossings and safety. Analytics helps recognise high-risk travellers based on their actions and on 'red flag' targets on checklists. This also eases the movement of legitimate travellers in accordance with the Schengen philosophy.
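To make the idea of 'red flag' checks concrete, here is a minimal rule-based screening sketch in Python. The record fields, thresholds and rules are invented for illustration only; they are not any authority's actual criteria.

```python
# Minimal sketch of rule-based 'red flag' screening for border control.
# The record fields and thresholds are hypothetical, for illustration only.
from dataclasses import dataclass, field

@dataclass
class Traveller:
    passport_on_watchlist: bool
    visa_valid: bool
    cash_declared_eur: float
    previous_overstays: int
    flags: list = field(default_factory=list)

def screen(t: Traveller) -> str:
    """Return 'refer' if any red-flag rule fires, otherwise 'clear'."""
    if t.passport_on_watchlist:
        t.flags.append("watchlist hit")
    if not t.visa_valid:
        t.flags.append("invalid or expired visa")
    if t.cash_declared_eur > 10_000:
        t.flags.append("large declared cash amount")
    if t.previous_overstays >= 2:
        t.flags.append("repeated overstays")
    return "refer" if t.flags else "clear"

if __name__ == "__main__":
    t = Traveller(passport_on_watchlist=False, visa_valid=True,
                  cash_declared_eur=2_500, previous_overstays=0)
    print(screen(t), t.flags)   # -> clear []
```

In practice, rules like these only sort out the obvious cases quickly; the statistical and network techniques discussed below handle the rest.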

Automation of asylum application processing

Automation in particular will deliver high cost savings. The processing of asylum applications can be automated to a significant extent, which saves the immigration authorities' resources and makes operations more efficient. In the US, for example, immigrants' applications are largely handled using analytics programmes. The system alerts the authorities to cases that require closer investigation, for example due to security risks.

By using analytics programmes, it is possible to quickly generate an overview of, for example, how a certain number of immigrants will affect the housing or employment situation in a country or a municipality. Different kinds of simulations and stress tests can thus be used to find out how well prepared we are for a situation and how likely it is that we can manage it.  
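As a rough illustration of such a stress test, the sketch below runs a simple Monte Carlo simulation of housing demand against a municipality's capacity. All figures (arrivals, probabilities, capacity) are invented; a real exercise would use far richer models and data.

```python
# Minimal Monte Carlo sketch of a capacity stress test: how likely is it that
# a municipality's housing capacity covers an incoming group of asylum seekers?
# All figures are invented for illustration.
import random

def simulate_once(arrivals: int, p_needs_housing: float, capacity: int) -> bool:
    """One simulated scenario: does demand for housing stay within capacity?"""
    demand = sum(random.random() < p_needs_housing for _ in range(arrivals))
    return demand <= capacity

def stress_test(arrivals=3000, p_needs_housing=0.6, capacity=1820, runs=10_000):
    """Share of simulated scenarios in which capacity is sufficient."""
    ok = sum(simulate_once(arrivals, p_needs_housing, capacity) for _ in range(runs))
    return ok / runs

if __name__ == "__main__":
    random.seed(1)
    print(f"Probability capacity is sufficient: {stress_test():.1%}")
```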

Allocation of aid measures

Analytics is also excellent for allocating the aid measures offered by society. A model could be taken from New Zealand, for example, where the Ministry of Social Development has successfully used analytics tools to transform its social work. Through extensive data analysis, they found that two out of three adults covered by the social security system had already needed support before the age of 20. In order to reduce dependency on social aid, the ministry decided to allocate resources to preventive work with children and young people.

In New Zealand, a decision was made to use the segmentation and customer approach that is typical of the corporate world. Unlike a corporation, however, the aim was not to increase the number of customers but, on the contrary, to reduce the number of customers covered by social security. Based on the data analysis, the greatest benefit would be achieved with preventive measures targeting teenage parents as well as children and young people in foster care. Preventive services were tailored for these groups. They were offered, for example, a personal mentor, support for planning their personal finances, and education and training services.

5 measures: how the challenges of immigration can be solved by using analytics 

  1. Recognise the critical data sources and integrate the data taken from them into a format that can be analysed. This is normally the hardest stage, even though it is generally believed that modern systems already contain the needed data.
  2. Define the rules that make it possible to find suspicious travellers. In the long run, most of this is automation: the aim is to recognise problem-free cases and let them continue their journey with ease.
  3. Define what kind of behaviour is considered normal and let the data tell you what is out of the ordinary. Recognising deviations is an essential part of finding new cases of misuse. Individual cases will then reveal significant deviations in comparison with control groups. With this method, organised crime and the main offenders behind it can usually be found.
  4. Use predictive modelling. It draws on known patterns of abusive behaviour and assesses how likely it is that a person is part of organised crime.
  5. Combine data by analysing social networks. Connections between different pieces of data can be found by creating a network to which, for example, bank accounts, flight details, applications, history data and properties are added (see the sketch after this list).
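A minimal illustration of measure 5, assuming invented application records and the open-source networkx library: applications that share an attribute such as a bank account or phone number end up in the same connected component of a graph.

```python
# Minimal sketch of measure 5: linking applications through shared attributes
# (e.g. the same bank account or phone number) to surface possible organised
# networks. Data and field names are invented for illustration.
import networkx as nx

applications = [
    {"id": "A1", "bank_account": "FI21", "phone": "+358-1"},
    {"id": "A2", "bank_account": "FI21", "phone": "+358-2"},
    {"id": "A3", "bank_account": "FI99", "phone": "+358-2"},
    {"id": "A4", "bank_account": "FI77", "phone": "+358-9"},
]

G = nx.Graph()
for app in applications:
    G.add_node(app["id"], kind="application")
    for attr in ("bank_account", "phone"):
        G.add_node(app[attr], kind=attr)      # shared attributes become hub nodes
        G.add_edge(app["id"], app[attr])

# Connected components group applications that share any attribute.
for component in nx.connected_components(G):
    apps = sorted(n for n in component if G.nodes[n]["kind"] == "application")
    if len(apps) > 1:
        print("possible linked group:", apps)  # -> ['A1', 'A2', 'A3']
```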

 

See our website for more information on promoting public safety by using analytics.

Watch a video for more information on how analytics is utilised in border control.


A certificate speaks for our security

A couple of years ago, we at SAS Institute decided to invest in the security sector in Finland. We had of course worked in the field before, but to develop the business further we made investments and hired new top professionals in the field.

As we worked with customers on security, we increasingly began to consider the security of our own operations as well. Careful attention to security has always been part of our working culture. Still, we asked ourselves whether we could make our operations even more transparent to our customers.

The answer was the ISO/IEC 27001:2013 certificate, which guarantees the quality of our security management system. The certificate proves that our operations follow best practices and standards. When our ways of working are well documented, it is also easier for our customers to understand how we take care of security.
 

The certificate covers 114 security requirements

The project to obtain the certificate started a year ago. It was great to see that our processes were already at a good level: we were able to complete the audit process in less than a year. The certificate's requirements include 114 controls, not all of which an organisation needs to fulfil. We nevertheless decided to develop our operations so that we could tick every box.

Although the process moved quickly, we did not take the changes lightly. One sign of this is the very comprehensive renovation of our office premises. The certificate is not only about information security but about a holistic, secure way of operating that has been made a visible part of everyday work.

Security management continues every day

The whole process was an excellent opportunity for our company to examine our own ways of working. No skeletons came to light, but today's business involves so many risks that it does no one any harm to update their processes from time to time.

Perhaps the most important finding from the audit was the realisation that an organisation's security management is a continuous process. Now that the audit is done, we focus on reporting the deviations we observe in our daily work. Even small false signals need to be collected so that we get data about our operations. Based on that data, we can make even better, analytical decisions in the future.


Marketing today – easier or more difficult?

Marketing has evolved hugely during the last few years. This we know. One of the main reasons is the digital revolution. This we also know. The digital revolution has opened new doors and opportunities to know our customers much better than before, but at the same time consumers are much more demanding, and there is a bunch of new technologies, together with data, that need to be managed. So in a sense there is much more we should know now than before. Has this made marketing easier or more difficult?

There are evidently still a number of challenges to manage. In this white paper you can find a more extensive description of the situation. Here, however, I will take a shorter journey through how to tackle the challenges.

The name of the white paper is “Customer Intelligence in the Era of Data-driven Marketing”.

The first question that comes to my mind is: is this really the era of data-driven marketing? There is certainly a huge amount of data out there that could, and should, be used in marketing. My feeling is, still, that most of that data goes unused. Or is at least used sub-optimally.

 

The Three Biggest Marketing Challenges Today

In order for any company to be successful in the field of marketing today there are three main issues that need to be addressed:

  • How can I know my customers better?
  • What should I tell my customers to keep them happy?
  • How do I balance my budgets to keep both my customers AND my management happy?

The short answer to all of the questions above would certainly be: analyze the heck out of the data you have and use what you learn in all interactions.

The longer answer you can read in the white paper mentioned above. It includes thoughts about marketing moving from Mad Men towards Math Men, because marketing professionals today need to understand the power of data and have a certain amount of analytical capabilities.

People, Process, Technology

Also, you probably need to do something about your people, processes, and technology. You will need a strategy that supports customer analytics at all relevant stages. Strategy first, always. Many people will frown at “data-driven”, “analytics”, and “statistics”. But they shouldn’t. Analytics doesn’t have to be excessively difficult and complex. There are modern tools that make analytics and data accessible to the analytically non-savvy. And if you implement a centralized platform that provides you with a complete customer view and functionality to manage all customer interactions regardless of channel from one location, you will quite quickly see the value data-driven marketing will bring you.

And now to some buzzwords

Real-time. As is usual with buzzwords, there is a lot of confusion and many different views of what real-time really is. Some think it is equal to inbound marketing. Some think you only apply it to digital channels. The truth is that real-time is much more than that. To quote the white paper: "It is about merging real-time data with events that happened on other channels from the same customer, to have the full behavioral context, recognize him or her as one and the same and know what the best actions will be". Sweet.

Attribution modeling. “The use of advanced analytics to allocate proportional credit to each marketing touch point across all on- and offline channels, leading to a desired customer action”. In order to really know which actions in which channels drive customer behavior towards the desired goal you need to apply some serious analytics. This is where marketing attribution helps. It uses one or several techniques to answer which lever really works best.
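As a rough illustration of the idea, the sketch below implements two simple rule-based attribution schemes (equal credit and a position-based 40/20/40 split) over an invented touchpoint path. Real attribution modelling, as the white paper discusses, typically layers statistical or algorithmic models on top of rules like these.

```python
# Minimal sketch of rule-based attribution: allocate credit for one conversion
# across the touchpoints in a customer's path. Paths and channels are invented.
from collections import defaultdict

def linear_attribution(path):
    """Give every touchpoint an equal share of one conversion."""
    credit = defaultdict(float)
    for channel in path:
        credit[channel] += 1.0 / len(path)
    return dict(credit)

def position_based(path, first=0.4, last=0.4):
    """40/20/40 rule: most credit to the first and last touch, rest spread evenly."""
    if not path:
        return {}
    if len(path) == 1:
        return {path[0]: 1.0}
    middle = path[1:-1]
    if not middle:                       # only two touches: split evenly
        return {path[0]: 0.5, path[-1]: 0.5}
    credit = defaultdict(float)
    credit[path[0]] += first
    credit[path[-1]] += last
    for channel in middle:
        credit[channel] += (1.0 - first - last) / len(middle)
    return dict(credit)

path = ["display", "email", "search", "web"]
print(linear_attribution(path))   # each channel gets 0.25
print(position_based(path))       # display 0.4, web 0.4, email 0.1, search 0.1
```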

Back then to the original question: has marketing become easier or more difficult than before the digital revolution? To be honest – who cares? It’s certainly demanding, fun, and the playing field continues to open up new opportunities to succeed or completely fail in impressing the consumer. Both are up for grabs, and my experience has it that far too many companies still fail.

As said, this was the short version of the truth about data-driven marketing. If you want to learn more, which you should, check out this white paper.


Mind the gap between business, IT and analytics as you step aboard

We have all heard about the coming of the Messiah. The new CDO. No, it is not some suspicious new financial instrument or a carbon molecule - it is the new role that reports to the CEO and sits on the executive team: the Chief Data Officer, or Chief Digital Officer, a long-missing function in the organisation whose task is to support the digital transformation.

"Det bästa sättet att beskriva en CDO är att det är en person som kan krossa silos och föra samman olika avdelningar"

Andy Gilman of the consultancy CommCore Consulting argues that the best way to describe a CDO is as a person who can break down silos and bring different disciplines and departments together. A job description like that makes it somewhat hard to attract applicants, but it is in fact exactly what the role is, and exactly what is needed for a traditional business to adapt to the digital world.

One study has shown that 83% of the companies that have been most successful in their digitalization have recruited a CDO or an equivalent function. At the same time, Gartner says that only 20% of larger companies have such a role in place today. How many companies have the digitalization strategy on their agenda? It is the trendiest word of the moment, so presumably more than 20%.

According to Apigee, to succeed as a CDO he or she must be able to: create shared goals for the company, develop a digital strategy, embrace data-driven experimentation, maintain a large network of experts, speak to technologists in technologists' language and to the business in the language of the business, and understand the importance of linking the digitalization process to the business goals. A multifaceted role that demands a great deal of whoever holds it.

What reality will the CDO face, then? We all have our opinions about, and experience of, how the collaboration between the business, the IT department and the analytics unit works. To this we now add the vendors. In a few exceptional cases the collaboration works, but there is much left to be desired. How many people in IT know which of the company's products are the most profitable? How many in the business know which IT platform the company's entire future rests on? How many in the analytics unit know which metrics drive the decisions? And how many vendors know their customers' strategic goals like the back of their hand? Thought so.

So what, we all have our priorities, specific competencies and capabilities. But here things have gone seriously wrong. 80% of all IT projects fail, and we all know the costs of this. The causes can be debated, but they often come down to three things: poor communication, weak governance and unclear leadership. Enough about that; let us look ahead and learn from the mistakes.

When it comes to Big Data and Analytics, this discipline essentially consists of three parts:

  1. Data Management (the IT department): capturing, cleansing, purging, storing and enriching data that comes from the business or the outside world.
  2. Analytics (the analytics unit): the analytical models, also called "the algorithms", which provide the recipe or the formulas for the result.
  3. Business Intelligence (the business): the visualization that makes it possible to understand, realize and use the result in daily work.

All components are needed and all departments must collaborate - and to succeed, the work must be orchestrated (communicated, governed and led) by the CDO.


The CDO role: the world's most interesting job, or the eye of the storm?

How can we as a vendor support this work? When it comes to implementing analytics platforms, we at SAS alone have delivered roughly 4,000 projects. Vendors have valuable experience that should be drawn upon, and at an early stage. Requirements often arrive incomplete, raising more questions than answers, which leads to interpretations that benefit neither vendors nor customers when the final solution is to be procured and taken into production.

One approach worth adopting is to start by understanding the analytical maturity of your own organisation: competencies, processes and capabilities. This is what we at SAS call Analytic Coaching and Analytical Factory. With this in place, we can create a shared language with relevant metrics and a defined business case that clearly shows the purpose of the solution and the value it should generate to realize the business goals. Only then can the real benefit to the end customer or citizen be realized, and, as a positive side effect, the gap between business, IT and analytics be narrowed.

Other posts I have written:


The Internet of Things is wasted without risk-taking

The true potential of the Internet of Things will never be realised if it is merely used for improving the efficiency of existing business operations. Yet the typical expectation is that an external service provider delivers IoT as a packaged solution for making the existing business more efficient.

While Finnish companies have extensive experience in data collection, the Internet of Things is not only about monitoring everything. The value and significance of data can only be discovered through analysis. The tricky, and simultaneously fun, part of the process is that it is difficult to predict the result before actually collecting the data.

This is exactly what is holding back the development of IoT today. During financially challenging times, companies are wary of investing and look for a return on every euro spent.

Productive data analysis requires an open mind. While it is undoubtedly important to improve business efficiency and utilise data for maximising profitability, the greatest innovations are typically born out of business opportunities created in completely new markets or sectors. The most lucrative jackpots are ideas that cannot be foreseen before data analysis.

Let us discuss a few examples. Elevator companies provide services to large masses of people on a daily basis, which means they possess a large amount of data on the movements of their users. This data could be utilised in planning parking facilities, developing restaurant services or ensuring security. Another example could be a crane company with a hundred active cranes operating in the middle of a large city. This gives such a company a unique perspective on the city. The company could use the cranes, for example, for collecting data for traffic control purposes. The possibilities are endless.

Or maybe there are no possibilities. After all, you can never be certain of the results. In the academic world, not many researchers are able to achieve significant results. However, the few breakthroughs make it worthwhile to finance basic research. A publisher knows that ten publications can be published at a loss, as long as there is one hit to balance the losses.

It is certain, however, that if the opportunities are not explored, many innovations are never born. Large American corporations have a lot more liquidity to spend on finding those gold nuggets than Finnish companies do. For companies of this size, investing in data analysis is a much smaller risk than for Finnish medium-sized companies. Nevertheless, we cannot be lulled into maintaining the status quo. Creating new sectors, companies, services and jobs requires risk taking.

Utilising the Internet of Things in a way that supports and creates business is not an easy feat, which is why many companies are looking outside for assistance. We need a bridge between business and IT. Advice should be heeded right from the start. Other useful sources for ideas can be found in the experiences of others and by reading articles on the topic.

Oscar Lindqvist
Business Advisor

At SAS Institute, Oscar's area of expertise pertains to the manufacturing industry. He helps companies and analytics find each other. Oscar possesses over 20 years of management experience in the manufacturing industry.


Life Science Digitalization – Standardization Is Core

As promised in my first article (Transformation and convergence in the health and life science ecosystem), this is the second article in a series of five. This article focuses on digitalization in Health & Life Sciences (HLS) and the data integration and standardization it requires. Even though it might not be the most exhilarating topic, data integration and standardization is a crucial foundation and precondition for digitalization in life sciences.

A peek into the future

Before elaborating on some of the ongoing data standardization initiatives in life sciences, let's put things in perspective and take a sneak peek into the future:

Sometime in the near future, each individual will have access to all his or her own personal health and medical data. Just as we have bank accounts today containing all our financial data (e.g. savings, loans, investments), sometime in the future we will have individual 'health accounts' containing all our health and medical data (e.g. electronic health records, use of medication, individual DNA and gene profile data, data from wearables, personal health data generated via health apps). This sensitive data will be stored in a secure cloud, and we will have access to it and be able to grant other stakeholders (e.g. doctors, medical advisors, family members) access to parts of or all of it. We will expect this data to be used proactively for preventive purposes and to provide more precise medical diagnoses and counselling.

Actually, elements of the above already exist. A good example is The Danish e-Health Portal (Sundhed.dk), which enables all Danish citizens to access their personal health data (EHRs, electronic medicine profile, personal medical history, organ donor registration, etc.). A next step could be to expand the portal with, for example, genetic data and the ability for each citizen to upload data from their mobile eHealth apps and devices, and to actually use all this data for preventive and more precise diagnoses and treatment. I presume several other countries have similar e-Health portals under development.

This type of development will accelerate as eHealth technologies mature, combined with a growing macro trend of individuals taking more control of their own health, treatment and general well-being. As Dr. Bertalan Mesko describes it very well in this recent McKinsey article, "Healthcare will be driven much more by consumers than physicians, with patients increasingly coming to their doctors with more information, parameters they measured at home, and an informed opinion about how they should be treated".

As individuals (or patients), we are better informed. Looking at the pharmaceutical industry, we see the first steps of 'data being the new pill'. As Michelle Longmire, CEO of Medable Inc., writes, "…most drugs will have both a chemical and digital component, as every pill will have a companion mobile app that collects patient-specific data". Michelle has shared this perspective in her articles "'Data as a Drug' Will Define the Next Decade of Medicine" and "Beyond the Pill: Data is the New Drug". In an increasingly competitive sector, and in order to get closer to patients, life science companies are focusing on developing apps and devices that improve patients' adherence to medication, improve the overall treatment, improve patient loyalty to the brand, and ultimately provide patients with a better combined 'value proposition'.


Think about it – if you had the option, would you say ‘no’ to more proactive health monitoring, where your individual healthcare data are integrated, standardized, analyzed and compared with millions of other individuals’ healthcare data, with the benefit of detecting serious diseases as early as possible, and based on that information be treated optimally?

At the core of the digital evolution described above is the need to integrate and standardize data from many different and diverse sources (electronic health records, data from mobile health apps, wearables and devices, etc.) before it can be analyzed and turned into meaning. Benchmarking and analysis are not possible without comparable (standardized) data; there is no use in comparing 'apples with pears', and avoiding that requires some kind of data standardization.
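To make the integration problem concrete, here is a minimal Python sketch that maps two invented source records (an EHR export and a wearable reading) onto one shared observation schema with a common unit. The schema and field names are made up for illustration; they are not CDISC, IDMP or OMOP structures.

```python
# Minimal sketch of the integration problem: readings arrive from different
# sources with different field names and units, and must be mapped to one
# shared schema before they can be compared or analysed.
from datetime import datetime, timezone

raw_records = [
    # hypothetical electronic health record export (glucose in mmol/L)
    {"source": "ehr",      "patient": "p-001", "glucose_mmol_l": 6.2, "ts": "2016-03-01T08:00:00"},
    # hypothetical wearable app, reporting glucose in mg/dL
    {"source": "wearable", "user_id": "p-001", "glucose_mg_dl": 126,  "time": "2016-03-01T12:30:00"},
]

MG_DL_PER_MMOL_L = 18.0  # approximate conversion factor for glucose

def to_common_model(rec):
    """Map a source-specific record onto one standardized observation."""
    if rec["source"] == "ehr":
        value, ts, subject = rec["glucose_mmol_l"], rec["ts"], rec["patient"]
    elif rec["source"] == "wearable":
        value, ts, subject = rec["glucose_mg_dl"] / MG_DL_PER_MMOL_L, rec["time"], rec["user_id"]
    else:
        raise ValueError(f"unknown source: {rec['source']}")
    return {
        "subject_id": subject,
        "observation": "blood_glucose",
        "value": round(value, 1),
        "unit": "mmol/L",
        "observed_at": datetime.fromisoformat(ts).replace(tzinfo=timezone.utc),
        "source": rec["source"],
    }

for obs in (to_common_model(r) for r in raw_records):
    print(obs["subject_id"], obs["value"], obs["unit"], obs["source"])
```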

Moving towards increased standardization for better utilization of data

The pharmaceutical industry and regulators have been working for decades to standardize the information they share, primarily driven by the regulators pushing the pharmaceutical industry. Both regulators and the pharmaceutical industry want to be able to view, analyze and compare safety and efficacy across clinical trials, medicinal products and/or treatments. With payers being more cost-sensitive and the pharmaceutical market more competitive, there is a push for more comparative analysis between medications and treatments to justify price and reimbursement rates. And from the general public, there is an expectation that the industry uses data more optimally in order to provide better and more cost-efficient medicinal products and treatments. As part of the global harmonization and standardization within life sciences, and for the purpose of this article, I will share three data standardization initiatives that currently have very high focus and are being adopted within the pharmaceutical industry:

CDISC – In drug development and clinical research, Clinical Data Interchange Standards Consortium (CDISC) is an international non-profit organization that provides clinical research data standards. The main purpose is to provide global and platform-independent data standards that will enable information systems to operate together to share information about clinical trials. Several of the data standards are now ‘de facto’ industry standards and are becoming mandatory by the regulatory authorities. This standardization within clinical research has been an ongoing endeavor during the last 15 years or more (see figure below regarding adoption of different CDISC standards). To avoid any legal problems and to maximize adoption, the standards are vendor- and platform-neutral and made freely available via the CDISC website.

ISO IDMP – The ISO Identification of Medicinal Products (ISO IDMP) is a global standard that provides the basis for the unique identification of medicinal products. The European Medicines Agency (EMA) is the first regulatory authority to mandate ISO IDMP in 2017/2018 (expected first iteration deadline), and other agencies will follow. ISO IDMP is a standardization within regulatory submission and pharmacovigilance and mandates pharmaceutical companies to integrate and standardize product data from multiple disperse internal sources (including Regulatory Affairs, Clinical/Drug Development, CMC, Product Supply/Manufacturing and Marketing). Integrating and standardizing this data to IDMP is a comprehensive data management task, and for some organizations the initial journey towards Master Data Management. For more elaboration on ISO IDMP and the related data standardization challenges, I recommend the white paper “Crossing the IDMP Data Chasm” as well as EMA’s IDMP webpage.

OMOP – With Real World Evidence (RWE) being a very hot topic for pharmaceutical companies as a way to gain new insight and revenue streams from available internal and especially external (big) data sources (patient registries like Optum, CPRD, Truven, as well as data from apps and patient wearables, etc.), this raises the demand for standardization across these observational data. This is where the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) comes into play. This data model is increasingly seen as the ‘leading’ data model for observational and ‘Real World Data’ and is often used as a basis for the data model behind RWE analytical platforms (like SAS’ real-world data and analytics platform). One of the advantages with OMOP CDM is that it combines cost and clinical information in one data model. For more information on OMOP, please visit the OMOP website.

As an example of the data maturity and 'interconnectivity' of these data standards in the industry, it is interesting to see that assessment work has already been done on mapping data between OMOP and CDISC's BRIDG model. "The harmonization between OMOP's Common Data Model and the BRIDG model will establish the link between prospective clinical research and life sciences studies and retrospective active surveillance studies utilizing observational data such as those carried out by the OMOP investigators", as quoted from the poster 'Harmonization of the OMOP Common Data Model with the BRIDG Model'. There is also harmonization work being done between CDISC and ISO IDMP, with the aim of standardizing clinical data so that less data management effort is needed to make it ready for IDMP submissions. In my opinion, these efforts offer some interesting perspectives and are a step in the right direction towards using the data more optimally. Perhaps a recommendation for all e-Health app and wearable developers would be to take a peek at these data standards when developing new apps and wearables, to ensure that the data they generate can be efficiently integrated and used in a broader context.

A common denominator for all the standards mentioned is that they require extensive change management, comprehensive data management expertise and technology, as well as resources, to be adopted, maintained and used appropriately in the organizations. Pharmaceutical companies can spend several years adopting just one CDISC data standard into their end-to-end clinical processes. The same is foreseen for ISO IDMP: an implementation phase will take 9-18 months and is followed by a maintenance/operational phase to ensure regulatory compliance. Working at SAS, I'm proud to be part of a software company with strong domain expertise and data management solutions for the three highlighted data standards.

Undoubtedly, Health & Life Sciences (HLS) is transforming and becoming more digital. In an era where pharmaceutical companies are taking their first, experimental digital steps to 'get closer' to their patients (e.g. with mobile eHealth apps), and where regulators and patients demand increased openness and transparency, the underlying and implicit precondition for all of this is standardized data. In this article, I have highlighted three important data standards that are currently being adopted, and their importance for, and impact on, the digitalization of life sciences. If we agree that 'data is the new pill' and that it fuels the digitalization of life sciences, the ability to efficiently standardize and integrate data for analytical purposes and regulatory submission is a foundational, core capability. I strongly believe that what we see now is the early contours of the next data wave in life sciences.

(This was the second article in a series of five articles)


Five focus areas for better citizen service

In my dialogue with various public agencies, I see a growing need to work on the processes around citizen interaction. The aim is to build greater trust and better citizen service while at the same time becoming more efficient.

One of the biggest challenges is for the organisation to ensure that internal resources are prioritised correctly. The goal, of course, is to meet the various citizen needs regardless of which channel the citizen uses and when. As the number of possible touchpoints has grown, so has the complexity, which creates a need for more analytical and automated decision support.

Poorly planned interaction wastes resources
And what is the risk of not optimising and automating? One example of poorly used resources is the extra (and unnecessary) interaction that arises between a citizen and a caseworker because the citizen cannot fill in a digital form correctly. Small things count: the citizen may, for example, become unsure if he or she does not receive a confirmation once the form has been submitted, and therefore contacts the caseworker or customer support again. Another classic example is the citizen who cannot find the right information on the website and instead calls the already overloaded customer support line.

Many organisations have therefore started to work purposefully on becoming more efficient based on the solution concept we call Citizen Intelligence: using insights through automated processes to put the individual citizen at the centre.

The strategy is often centred around the following five focus areas:

Focus area 1: Management buy-in
Improved citizen service is an initiative that involves most parts of the organisation. It is therefore important to create understanding and backing from management before the work of creating a shared vision can begin. The work includes identifying current processes and interactions as well as drawing up a plan and a road map.

Focus area 2: Collection and insight
Insight is created from all the information that exists about the individual citizen. It is important to create an overview of how this insight is used, and where further insight is needed in order to give citizens the best possible service. Once the system is in place, the organisation gains control over all responses and reactions by following the activities of the people it communicates with.

Focus area 3: Efficient governance
Opportunities to streamline are a natural focus when you review your current processes. The goal is to let the system handle and optimise the communication flow itself, the logistics if you will, so that human resources can focus on the content of case handling. You can thereby prioritise your resources and deploy them where the needs or risks are greatest.

Focus area 4: A coherent citizen experience
The total citizen experience in a specific life situation often consists of a long series of touchpoints, not just with one but with several public agencies. It is important to create an overview of all the possible touchpoints the citizen has with the organisation and also look at how they can be improved. Touchpoints can sometimes be improved with automated decision support, and it can also be an advantage to reduce their number or even introduce new ones.

Focus area 5: Better and individualised citizen service
It is important to gain insight into what makes citizens satisfied with the organisation's services or information. These insights can be built into the planning and prioritisation of better citizen experiences. With Citizen Intelligence, you use the system's information about citizens and companies to make your communication efficient and targeted. The service improves because you can contact the target group at the right time with the right message.

Use ALL relevant information and give the best possible service to the individual citizen
Organisations must be able to act both on the historical (accumulated) insight about each individual citizen and on their current behaviour and needs, for example what they are doing on the website right now. This requires a data and analytics platform that ensures every interaction point is updated with relevant insight, so that the right decision can be made about the best action for that particular citizen at that particular moment. Automation is necessary because citizens today move across several different interaction points such as websites, mobile apps, e-mail, telephone and chat.
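As a simplified, conceptual illustration of that decision step (not a description of any specific SAS product), the sketch below combines stored insight about a citizen with the current interaction and picks a next action.

```python
# Simplified illustration of the decision step described above: combine stored
# insight about a citizen with the current interaction and pick a next action.
# Profiles, events and actions are invented for illustration.
def next_best_action(profile: dict, event: dict) -> str:
    """Decide what the touchpoint should do for this citizen right now."""
    # Current behaviour: the citizen seems stuck on a digital form.
    if event.get("page") == "benefit_form" and event.get("minutes_on_page", 0) > 10:
        return "offer_chat_assistance"
    # Historical insight: the last submitted form was rejected as incomplete.
    if profile.get("last_form_status") == "incomplete":
        return "send_prefilled_form_reminder"
    # Channel preference from earlier interactions.
    if profile.get("prefers_channel") == "email":
        return "send_status_update_email"
    return "no_action"

profile = {"last_form_status": "incomplete", "prefers_channel": "email"}
event = {"page": "benefit_form", "minutes_on_page": 12}
print(next_best_action(profile, event))   # -> offer_chat_assistance
```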

It is time we used the power of the data we have created, and are creating, to deliver the most optimised individual service possible today.

If you would like more information, you can read more here.


Previous blog posts on the topic:

Get ready to handle the digital citizens
5 prophecies for the public sector of the future


The era of smart insurance is dawning

What if a reckless driver adopted a more responsible approach because the car insurance pricing was based on driving habits? What if the senior from next door had the insurance payments based on kilometres driven, resulting in significant savings? This may be reality sooner than you think.

The Internet of Things will revolutionise the insurance business to the benefit of customers and insurers alike. From an insurance perspective, IoT is based on sensors that collect information on risk items, conveying the data to the insurance company. By analysing the elicited data, the insurer is able to make assumptions that allow for a more accurate evaluation of risks. In some cases the sensors can help prevent accidents, or, at the very least, minimise their impact. The use of IoT encourages customers to adopt patterns of behaviour that are conducive to health and well-being. This may also lead to positive changes in attitudes. In other words, the implementation of smart insurances can benefit the entire society.

Risk-free driving habits bring insurance payments down

I would forecast that the car insurance sector will be among the first to see major changes. Octo Telematics, an Italian partner of SAS Institute, is already offering alternative, sensor data-based insurance models to insurance companies in several countries. For example, the Octo Box, which is installed into a vehicle, can measure mileage, speed, acceleration and deceleration, even lane changes. This allows the insurer to base pricing on mileage or on a wider range of driving-habit data. Drivers adopting a safer, proactive approach may achieve savings on their insurance payments compared with a neighbour who drives less safely and economically.

In England, a large share of car insurance policies are so-called usage-based insurances, typically priced on mileage. In Finland, car insurance pricing is currently based only on the features of the car and the owner's accumulated bonus points, that is, years without accidents. The customer pays the same amount regardless of how much the car is actually used.
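As a rough sketch of what usage-based pricing can look like, the example below adjusts a base premium with mileage and driving-behaviour factors. All rates and factors are invented for illustration and are not Octo Telematics' or any insurer's actual model.

```python
# Minimal sketch of usage-based pricing: adjust a base premium with mileage and
# driving-behaviour factors derived from telematics data. All figures invented.
def usage_based_premium(base_premium: float,
                        annual_km: float,
                        harsh_brakes_per_100km: float,
                        night_driving_share: float) -> float:
    mileage_factor = min(annual_km / 15_000, 1.5)            # cap the mileage effect
    behaviour_factor = 1.0 + 0.05 * harsh_brakes_per_100km   # penalise harsh braking
    night_factor = 1.0 + 0.2 * night_driving_share           # night driving is riskier
    return round(base_premium * mileage_factor * behaviour_factor * night_factor, 2)

# A careful low-mileage driver vs. a less careful high-mileage one.
print(usage_based_premium(600, annual_km=6_000, harsh_brakes_per_100km=0.2, night_driving_share=0.05))
print(usage_based_premium(600, annual_km=25_000, harsh_brakes_per_100km=2.0, night_driving_share=0.3))
```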

The new pricing model can provide a competitive edge to the insurer in a highly competitive business environment. The technical prerequisites for the change are already there. Many modern car models are already equipped with sensors, or the possibility of installing sensors. The insurance company only needs the tools to collect and evaluate data and for risk modelling. Our mission at SAS Institute is to assist in analytics.

Proactive approach reduces water damages

Sensors are also making their way into homes and businesses. I installed a simple sensor system in my own house a year ago. The system monitors the water flow in the piping and stops the flow automatically once water has flowed continuously for a predetermined time. If a fault occurs in the washing machine or dishwasher, for example, this safety feature can significantly reduce the damage. Once the insurance company found out about the precaution, my insurance payments were reduced.
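The shut-off logic itself can be very simple. The sketch below, with simulated readings, closes the valve once flow has been continuous for longer than a configured limit; a real system would of course read from the hardware.

```python
# Conceptual sketch of the shut-off logic described above: close the valve if
# flow has been continuous for longer than a configured limit.
def should_shut_off(flow_readings_lpm, max_continuous_minutes=30, threshold_lpm=0.5):
    """flow_readings_lpm: one flow reading (litres/minute) per minute, newest last."""
    continuous = 0
    for reading in reversed(flow_readings_lpm):
        if reading > threshold_lpm:
            continuous += 1
        else:
            break
    return continuous >= max_continuous_minutes

# 45 minutes of uninterrupted flow, e.g. a faulty washing machine valve:
readings = [3.2] * 45
print(should_shut_off(readings))   # -> True
```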

It would also be possible to take a more systematic approach and transfer the data from the sensors directly to the insurance company's database. In addition to water pipes, households are full of things that could be monitored with sensors in order to collect data for evaluating and preventing damage. For example, voltage spikes are known to cause short circuits in domestic electrical systems, occasionally leading to fire damage. Sensors could help identify faulty household appliances before it is too late to address the problem. In the future, sensors could conceivably be used to measure indoor air quality, contributing to residents' well-being by enabling the early detection of changes in air quality with a potential impact on health.

In terms of property insurance, IoT will allow insurers to transition from reactive damage assessment and compensation to a proactive approach that facilitates damage prevention. I firmly believe that this is the smartest direction to take for residents, owners and insurance companies.

Sensors detect well-being and related risks

Many of us are already using activity trackers or sliding a sleep analyser sensor underneath the mattress. A recent smart life insurance pilot by LähiTapiola found that around 80 percent of 2,000 participants improved their habits when an electronic health inspection, a self-coaching programme and an activity tracker were incorporated into their insurance. In other words, IoT encourages a proactive, risk-eliminating approach even in the life insurance business. Our habits contribute at least as much as our genes and environmental factors to the risk of falling ill.

By agreeing to use an activity tracker, the customer provides the insurance company with valuable health data for risk assessment. With less pressure on compensation, the company is able to offer reduced insurance payments or other benefits. Predictive modelling can help determine the best possible customer experience.

In Finland we are highly capable of implementing new IoT-based techniques due to our familiarity with mobile technologies and the use of smart phones. Smart insurances are only a small step away. The change can be kicked off very quickly. The technical prerequisites are already in place. Now it is up to us to make bold decisions.

Interested in knowing more? Visit our pages with more information on SAS Institute solution models for insurance companies.

 


SAS Analytic Coaching Program



For more information, please contact me:
Email | mathias.arbman@sas.com
Phone | +46 72 564 8009

View file: Analytics Coaching-program-overview.pdf
