5 measures: how to defend against cyber threats

In most cases, the solution lies in leveraging data and methods that help the organization understand who the bad actors are. These methods must quickly expose malicious behavior so that the organization can take early, effective action. Buying higher firewalls and more antivirus software no longer works.

Instead, organizations must:

  1. Understand the origins and behavior of cybercriminals, and build models based on that understanding that look for anomalies.
  2. Screen the enormous amount of information flowing into the organization in real time and catch threats before they cause damage. For example, the US retail chain Home Depot confirmed that approximately 56 million payment cards were compromised in a cyberattack. The breach is believed to have taken place over several months. It was only detected when several banks reported credit and payment card fraud originating from the company's stores.
  3. Use advanced analytics to reduce false alarms and identify real problems. For example, the retailer Target had technology that raised an early warning about the 2013 credit card breach. The hackers gained access to the data using credentials stolen from an HVAC contractor who had worked for the retailer. The alert, however, was just one among many, and it was misclassified. "The operations center staff did not know where to start, or which threats were real," a company representative stated.
  4. Integrate cybersecurity into the core risk matrix alongside other core business requirements. It can no longer be an afterthought for the company.
  5. Invest more than today in recruiting the right people, and train them to take part in countering this new organizational threat. It is also important to find suitable external cybersecurity technology and practitioners. The market is full of experts who talk only about technology but leave the organization unprotected, and possibly in trouble once an attack has occurred. Companies especially need people who understand the behavioral patterns of cybercriminals and hackers, not just the technology of the solution.
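The anomaly models mentioned in point 1 can be sketched very simply. Below is a minimal z-score detector over a behavioral baseline; the data, the feature (daily outbound-transfer counts) and the three-sigma threshold are illustrative assumptions, not a description of any particular product.

```python
from statistics import mean, stdev

def find_anomalies(counts, threshold=3.0):
    """Flag observations whose z-score against the baseline
    exceeds `threshold` standard deviations."""
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(counts)
            if abs(c - mu) / sigma > threshold]

# Hypothetical daily outbound-transfer counts for one host;
# the spike on day 13 is the kind of behavior worth investigating.
baseline = [12, 15, 11, 14, 13, 12, 16, 14, 13, 15, 12, 14, 13, 480]
print(find_anomalies(baseline))  # [13]
```

In practice such a detector would run continuously over many features per user or host; the point is that the model learns what "normal" looks like and surfaces deviations, rather than matching known signatures.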

You can find much more interesting reading on this subject on our cybersecurity solutions pages.


The Internet of Refrigerators: is the cyber crook hungry?

How would it feel to discover in the morning that your new internet-controlled refrigerator is full of water, the milk warm and the butter rancid? Is your first thought that a hacker has connected to your fridge and switched on the automatic defrost? That is still a minor matter, covered by home insurance. A couple of months later, however, it is discovered that the top-secret specifications of your company's most important new product have leaked to the world. The country's best security firm is called in to investigate. After a long investigation, it turns out that one employee's computer had been connected to the control system of a Chinese IP-based refrigerator. The hacker had broken into the company's systems through the weakest link.

The Internet of Things (IoT) is one of the most significant technologies of our time, transforming the way businesses and citizens operate. Many industries will still change, and some will certainly disappear. The productivity gains created by the new technology leave few companies with any real choice: you must transform if you want to stay in the market. IoT, however, also brings with it a new threat, one that has in fact already existed for some time. When large numbers of physical devices are connected to computer networks, new cyber threats emerge on top of the existing ones.

Privacy matters not only in IoT itself, but in all applications, devices and systems that share information through it. In addition to public sources, hackers can obtain information from cars, smartphones, home automation systems and even refrigerators. A particular problem with IoT devices is their vulnerability: according to a study by Hewlett-Packard, as many as 70% of these devices contain serious vulnerabilities. There is indisputable evidence that our dependence on the connectivity of these devices weakens our ability to secure our information.

It seems that every week brings a new high-profile cyber breach, and not even the White House is immune.

The growth of cybercrime knows no bounds. Before we can protect our information, we need to know where the sensitive data resides. In many organizations this is still low on the list of priorities. Yet the number and variety of connected devices, together with third-party integrations, give hackers more entry points than ever before. Organizations are only as secure as their weakest link. Nor can we underestimate human error and carelessness.

According to research, insider threats are the most common route to sensitive data. Attackers have many ways to get hold of credentials; if one approach does not work, they try another, and the methods evolve constantly. Increasingly, cybercriminals exploit an organization's insiders directly to learn about the network's defenses and the location of sensitive data.

Let us return to the benefits of IoT. Roughly speaking, it brings major productivity gains to many industries. The key areas for exploiting IoT include:

  • Smart Life: Solutions that make life easier and safer for consumers
  • Smart Mobility: Real-time route management that makes travel more enjoyable and transport more reliable
  • Smart City: Innovations that improve quality of life and increase safety in cities
  • Smart Manufacturing: Factory and logistics solutions for optimizing processes, control and quality

The cybersecurity challenges posed by IoT stem from the threats created by the new technology and from organizations' maturity in managing the resulting risks.

Here I want to list the main factors that create these challenges:

  • Pace of change: In the post-financial-crisis world, companies move fast. New products, mergers, acquisitions and the adoption of new technology have all increased.
  • A network of networks: The evolution of the internet and its spread across different end devices introduces new vulnerabilities.
  • Infrastructure: Connecting infrastructure to the internet opens up more potential entry routes for hackers.
  • Cloud technology: The cloud is a prerequisite for IoT, yet at the same time managing and controlling information becomes ever harder. Data flows rapidly at a global level, and so do spam messages of all kinds.

Historically, threats have been managed with firewalls, encryption and other traditional or niche security methods. These are no longer enough. The new era demands more than protection, and this is where analytics plays a major role. In a security strategy, basic protection solves narrow, specific problems. It does not, however, integrate data, and therefore does not create the enterprise-level context required to prevent cyber risks. What is needed is something that can capture information in the intermediate phase, while the attacker is inside the network gathering data. This is where a real-time solution that applies behavioral analytics to the network becomes crucial.

When we examine the largest and most extensive cyber breaches, we can conclude in hindsight that the attackers had been inside the organization's network for several weeks, even months. Behavioral analytics understands not only network transactions but also their significance for the business, and can therefore help uncover the patterns and movements that reveal criminal activity.
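One crude illustration of what such a real-time behavioral solution hunts for: long-lived, regular connections rather than single suspicious events, since attackers who dwell for weeks tend to leave a steady trace. The hosts, destinations and 14-day threshold below are invented for illustration.

```python
from collections import defaultdict

def beacon_candidates(events, min_days=14):
    """Flag (host, destination) pairs seen on many distinct days:
    long-lived, regular traffic is typical of malware beaconing."""
    days_seen = defaultdict(set)
    for host, dest, day in events:
        days_seen[(host, dest)].add(day)
    return [pair for pair, days in days_seen.items() if len(days) >= min_days]

# Toy event log: (internal host, external destination, day number).
events = [("pc-042", "cdn.example.net", d) for d in range(5)]   # short-lived, normal
events += [("pc-017", "203.0.113.9", d) for d in range(30)]     # daily for a month
print(beacon_candidates(events))  # [('pc-017', '203.0.113.9')]
```

A single connection to an unknown address is rarely alarming; thirty days of them in a row is exactly the slow pattern that signature-based tools miss.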

In my other blog post you can read about the five measures your organization can take to defend against cyber threats. You can also find much more interesting reading on the subject on our cybersecurity solutions pages.


Digitalization needs less talk and more action

Digitalization is probably one of the most hyped terms of this decade, in business and public administration alike. The "digitalization is coming" style of proclamation could finally stop: it already arrived, and now it is time to roll up our sleeves!

The fingers of two hands are apparently not enough when I start counting how many seminars on digitalization I have attended over the past ten years. Oh, and we at SAS also organized a customer event on the topic at the end of May. Don't get me wrong, the matter is absolutely important, but as a term digitalization is getting numbingly worn out. And I am apparently not alone in my frustration, as Alf Rehn also seems to have vented his digitalization fatigue in Forum magazine.

I agree with Alf that it feels quite surprising that even this year digitalization is still being talked about as a rising trend. This also worries me. History has shown that when something keeps recurring in ceremonial speeches, it quite often remains empty words.

My own rough simplification is that digitalization is about the expansion of distribution channels and the channel mix, and about the opportunity to collect and analyze the data flowing through them more efficiently than before. I see digitalization as the natural evolution of things toward a more electronic society. We at SAS wanted to start this Nordic blog in order to strip away the unnecessary mystique around buzzwords such as the industrial internet and digitalization. The Nordic countries have much in common in their business and analytics cultures. That is why we merged our blog streams into one shared whole, where we hope you as a reader will find interesting examples and inspiration both from us here in Finland and from our neighbors.

As Alf Rehn writes in his column, digitalization is a curious term in that it is at once very concrete and yet abstract. Could that be why many companies, even large ones, freeze in place while pondering digitalization? Meanwhile, a ten-person startup races past and manages to digitalize its business in a few months. On the other hand, several very agile large Finnish companies also come to mind, such as Sanoma, Konecranes and S-ryhmä.

In the end, digitalization does not necessarily change the basic mission of organizations: they must still listen to their customers and develop their operations. Digitalization does, however, increase the amount of available data and speeds many things up, for example through automation, and analytics helps sift the golden grains from that data.

Data should be placed at the heart of digitalization. Every encounter produces data. In digital channels you can encounter not only a person but also a device, and obtain important information from it as well. The real value of digitalization lies in how an organization exploits all the information within its reach, whether it originates from the behavior of consumers or of machines.

We at SAS promise to strive to bring intelligence into the digitalization discussion. Without intelligence, digitalization easily remains mere tinkering with old things, when its ultimate purpose should be to give people more quality time face to face. Or do you disagree? I would love to hear your thoughts, so feel free to comment here on the blog, or let's continue the conversation in person.

 


Monitoring hospital-acquired infections

It is estimated that every tenth Danish patient suffers from an infection while hospitalized, and more than 3,000 patients die every year because of infections. So far, however, the actual number of infections has not been known, which has made it difficult to find out where action should be taken.

Together with SAS Institute, Lillebælt Hospital in Denmark has developed a new monitoring system that shows hospital-acquired infections and enables the hospital to quantify how many patients catch an infection while hospitalized. Professor Jens Kjølseth Møller of Lillebælt Hospital is the mastermind behind the new infection monitoring system, which SAS Institute has made possible using text analysis. He expects the system to reduce hospital infections by a third and thereby increase patient safety.

The system reflects Danish pioneering work that connects many years of clinical experience with modern technology. A breakthrough, says the hospital's CEO.

"Nobody can be satisfied that patients hospitalized at Danish hospitals run the risk of contracting a further disease. Working with patient safety is therefore a key focus area for Lillebælt Hospital. Now we have a tool that can read infections at patient level. This means that our clinical managers can monitor our efforts and do more of the things that work. This is a new era in efforts to reduce hospital infections," says the CEO of Lillebælt Hospital.

The first complete monitoring system for infections

Professor Jens Kjølseth Møller had previously developed a system that could produce an overview of the number of infections using the hospital's own files and databases. However, it was not possible to obtain a complete overview: that monitoring system could not capture the infections that doctors and nurses had documented in the medical records.

Therefore, Lillebælt Hospital chose to cooperate with SAS Institute because of its text analysis technology. The technology sets up rules and clinical terms that act as so-called 'triggers' for identifying an infection.
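As a toy illustration of the trigger idea, the sketch below matches a short list of clinical terms against a free-text note and flags the record for review. The terms and the matching logic are invented for this example; the actual SAS text analytics is far more sophisticated.

```python
import re

# Illustrative trigger terms; a real rule set would be curated by clinicians.
TRIGGERS = ["urinary tract infection", "sepsis", "positive blood culture",
            "antibiotics started", "wound infection"]

def find_triggers(note):
    """Return the trigger terms present in a free-text clinical note."""
    text = note.lower()
    return [t for t in TRIGGERS if re.search(r"\b" + re.escape(t) + r"\b", text)]

note = "Day 4 after surgery: fever 39.1 C, positive blood culture, antibiotics started."
print(find_triggers(note))  # ['positive blood culture', 'antibiotics started']
```

The value of the rule-based approach is that each flag is explainable: a clinician can see exactly which phrase in the record triggered it.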

"We now know the number of infections during a given period in a given department. We went from fiddling with scattered figures to providing, in cooperation with SAS Institute, a management tool for doctors and heads of departments. We provide SAS Institute with raw data, which it then presents in the way clinicians are used to retrieving other management information," says Jens Kjølseth Møller.

Several hospitals request monitoring of infections

In 2014, for the first time, the department managers at Lillebælt Hospital could pull reports of all infections. The solution allows hospital and department managers to monitor the development of hospital-acquired infections, with overview reports and trends over time from hospital level down to department and ward level.

"The information we have on patients today is the information that will help to avoid infections among patients tomorrow. Many of the things we already knew will be confirmed. But we will also be working more actively with what we call intervention, where we try to change procedures to see if we can do it better than we already do," says Jens Kjølseth Møller.

The professor is in demand not only at Lillebælt Hospital, but is now being approached from all sides. The United States and our neighboring countries in particular are interested in the new monitoring system. Here the question is how the system can be adapted to countries that do not have personal identifiers like the Danish personal identification numbers.


Welcome to Hidden Insights – The new Nordic blog about big data and predictive analytics

We realize that you might have ended up here by coincidence after following a saved link to your favorite local SAS blog. But don't worry: we will still address how Nordic companies and organizations can gain insight and get the power to know.

On the new Hidden Insights blog, you will find bloggers from all over the Nordics. Some will blog in their native tongue, whereas others will write in English. Whatever language you prefer, you can filter the content you are looking for on the right-hand side of the screen.

There are various ways to engage with the bloggers. Do feel free to join the discussion, comment, like or share the content you find on Hidden Insights. We also welcome guest bloggers, so if you have an idea for a post on big data analytics, you can reach out to me (per.hyldborg@sas.com).

We would also like to point you to the best content about SAS in the Nordics, advanced analytics and compelling industry insights.

Thank you for your attention.

Per


Big Data and scientific research: What can a data warehouse consultant and a space physicist at NASA learn from each other?

What can a SAS Partner from Sweden and a space physicist at NASA learn from each other?

Read how Big Data can be a connection point between academia and the business world, where the two help each other learn new and old methods so that both parties can reach insights quicker. (Mr Mohammadi graduated in 2012 from our Data and Business Analytics consultant/Data Scientist program in Sweden.)


What could a space plasma physicist have in common with a data warehouse consultant? At first glance, not much, but is it really so?

If we try to remove the space physics terminology, ignore fluid dynamics and Maxwell's equations, and instead focus on what these scientists actually do with their data, it might not be that foreign after all. The fact is that both numerical physicists and experimentalists are completely dependent on their data for new insights. The data source is usually numerical simulation based on a model or, in the case of an experimentalist, the instruments with which they measure their experiment. In both cases, however, the result can be truly big data, huge even.

Next, the physicist would prepare the data for analysis in different ways. If you have a data warehouse background, you might have just thought quietly for yourself “ETL?” (Extract-Transform-Load) and you would be right to think so, in some sense it is an ETL-process. Even though this might be true, data warehousing is used very little in this type of academic research. The question that comes to mind is: Can Big Data be a connection point between academia and the business world, where the two help each other learn new and old methods so that both parties can reach insights quicker? Yes, I actually believe that it could.

With these recurring thoughts, we reached out to one of my former colleagues from my own space physics days, Lars Daldorff, who nowadays is contracted at NASA and works with numerical plasma simulations. We asked ourselves a simple question: what would happen if we took simulations Lars had of the Sun and structured them so that they could be uploaded into a big data, in-memory environment that used out-of-the-box analytical methods? The challenge Lars Daldorff faced in his work at NASA was not producing big data volumes, but analyzing them effectively. To illustrate the point: when we as data warehouse consultants asked "How big is your data?", the reply we got was "How big do you want it to be?".

In the academic numerical world, fast-paced technical development and access to large supercomputer centers mean that the production of data can easily be scaled up. However, much of what is produced is of little or no scientific interest; it is simply already-known physics or noise of various sorts.

Fig 1. A simplified description of the existing process for working with scientific data to find insights.

Basically, this is a needle-in-a-haystack situation: the phenomenon of interest is somewhere in the data, but you usually do not know "where" or even "when" it can be found. At the same time, the visualization and analysis methods usually used are time-consuming. As a consequence, the researcher in question (in this case, a physicist) needs to slice the data by making qualified guesses about "where" and "when" the needle is. A simplified description of the process can be seen in Fig. 1. The problem with this process is that even if you are lucky and happen to find an interesting phenomenon on your first guess, you cannot be sure it is the only phenomenon of interest in the data.

This problem means that the time between the gathering of data (from numerical simulations of the sun in our case) and insight about your data becomes very long. But what if you didn’t have to visualize your data in slices? What if we could take out the guess work from the process? What if it would be possible to upload all of your data at once into a platform which would instantly tell you where the needle (or needles) is (are) by using standardized methods? What if, after you have found the needle(s), you could simply export the data of interest to do your full analysis on only that which matters?

Why speculate? Let’s do it!

Soon the collaboration had started, and the first analyses began to come out. The phenomenon they wanted to study was simulations of the Sun, or more specifically the magnetic arches associated with sunspots, which contribute to a considerable increase in the X-ray and ultraviolet radiation from the outer solar atmosphere (and hence into the upper atmosphere of Earth), and how these arches arise. The phenomenon can be seen in this beautiful YouTube clip that the Heliophysics group at NASA recently released as part of the SDO (Solar Dynamics Observatory) project.

https://www.youtube.com/watch?v=GSVv40M2aks

There are still many open questions regarding these phenomena today, but their effects are clearly visible in the clip above. When these powerful arches are created, there is speculation that a phenomenon called "magnetic reconnection" occurs. It is this moment in the data that you need to identify, both spatially and in time, that is, both "where?" and "when?".

Fig. 2. Simplified description of the new process "from creating/collecting data to insight", where we first load the entire data set, automatically analyze and visualize all "Point of Interest" (POI) candidates, and then export the data for deeper analysis by the subject expert.

We loaded the entire data set into SAS Visual Analytics, in our own setup in the Microsoft Azure cloud environment, and helped Lars Daldorff get started with the platform; then we could start looking for the needle(s) in the haystack. The aim was to automatically identify where and when the phenomenon occurs, for all possible candidates. We wanted to replace the circular process described in Fig. 1 with the linear process described in Fig. 2. This could simplify and speed up how you actually get results and find insights, in this particular case about how the Sun works, and maybe in your case, about how your customers behave.

Fig. 3: Simulated data for one of the many "arches" formed at the surface of the Sun, and how we used SAS Visual Analytics to identify the crucial moment, the needle in the haystack, with the help of heat maps and decision trees.

What we see in Fig. 3 is how standardized methods that are widely used in the business world suddenly find use for a completely different type of data. These tools and methods do not care what your data is: the methods for identifying points of interest, performing analysis, visualizing and creating reports are the same, regardless of whether they are used on business data or scientific data.
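A minimal sketch of automated POI detection, instead of guessing slices: compute one summary statistic per time step and surface the steps where it jumps most sharply. The statistic and data below are invented, and the actual work used SAS Visual Analytics rather than hand-rolled code.

```python
def points_of_interest(series, top=3):
    """Rank time steps by the size of the change in a per-step
    summary statistic; the largest jumps are POI candidates."""
    jumps = [(abs(series[i] - series[i - 1]), i) for i in range(1, len(series))]
    return [i for _, i in sorted(jumps, reverse=True)[:top]]

# Toy per-snapshot statistic (e.g. total magnetic energy in the simulation).
energy = [1.0, 1.1, 1.0, 1.2, 5.8, 5.9, 6.0, 2.1, 2.0, 2.1]
print(points_of_interest(energy, top=2))  # [4, 7]
```

Whatever the domain, the subject expert then inspects only the flagged snapshots in detail instead of slicing the whole data set by hand.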

Something the academic world is generally very good at is experimenting on its data: daring to play with it and explore it with the mindset "I don't really know what I will find, but I hope it's something interesting!". Or, to quote a famous physicist:

“Experiment is the only means of knowledge at our disposal. Everything else is poetry, imagination.” - Max Planck

This is something we in the business world really could learn from. You don’t always need to know in advance what explicit report or analysis the work should result in, there is great value in having all of your data easily accessible at the tip of your fingers so that you can experiment on it and through your experiments reach new insights about your business.

In conclusion, this is only the beginning. We have already submitted these preliminary results of our collaboration to the Joint Statistical Meeting in Seattle (http://www.amstat.org/meetings/jsm/2015/) and received approval to present them during the conference in August 2015. Our hope is that these results can help Lars Daldorff in his research at NASA and that this case can showcase the value of explorative data analysis. There are numerous possibilities for this approach, and it could potentially help different types of researchers obtain insights and results more quickly. More updates will follow! Keep your eyes open!

Acknowledgments
This work has been made possible by a very good collaboration between Lars Daldorff (contracted researcher at NASA) and Infotrek (a data warehouse consultancy in Sweden), with contributions by Saiam Mufti and Lars Tynelius and support from SAS Institute Sweden. Also thanks to Laura Fernandes for helping with this text.

 


What About Risk Control?

Twenty years ago, risk was controlled by small departments in the banks. As authorities felt held hostage by the systemic importance of banks, they introduced requirements for banks to become more solvent. To get the behavior of banks under control, many types of reports were also required. From this point on, the job description of risk departments shifted. Two new missions were introduced: to prove they had no risks and to ensure they were compliant. This shift has given authorities a feeling of control; however, it leaves one important issue unattended: who is now controlling the risk?


In banks, different types of risk are constantly building, and bubbles often pop at an early stage. Regulation has forced banks to use capital as the steering mechanism, which is a reactive way of working with risk. In other words, you make sure there is enough money to pay out if the bubble grows larger than you thought. But what about the people with the needles, making sure the bubbles do not grow out of proportion?

The task of popping bubbles in time requires an agile risk control function with clear mandates, able to proactively decrease risks. Instead of focusing on capital to cover losses, a more forward-leaning risk control function can be part of the business. Examples where this can be pushed much further than today include supporting the development of new products from a risk angle, a clearer mandate in M&A, and stronger mandates in applying temporary portfolio limitations.

In many countries, this is important right now! With the volume focus driven by tougher competition, rising household loan-to-salary ratios, and stretched market prices and macro data, risks are building.

I say: Reclaim Risk Control!

 


We must celebrate the best examples of coherent citizen service

Over the past 10 years, a thorough digitalization of a range of services aimed at Danish citizens has taken place. Self-service solutions have gained ground and many paper forms, among other things, have been digitized. The fact that the public sector tackled this task early has rightly given Denmark a strong position in Europe and the world. But if we are truly to reap the benefits, even better coherence must be created between the citizen-facing services and the underlying systems. Now a new award, Den Offentlige Servicepris (The Public Service Award), will put the spotlight on the best solutions and the best examples of putting citizens' needs first.

The partners behind Den Offentlige Servicepris want to draw attention to the need for a shared understanding of, and basis for, the level of public service. The partners are my own company, SAS Institute, the research firm Wilke and the consultancy PA Consulting Group. Together we are creating a large benchmark study of the service level in a range of public enterprises and organizations, to find those best able to create a coherent, easy-to-use offering that satisfies its users.

Focus on value rather than bureaucracy

It is important to stress that public sector leaders certainly do not have an easy task in updating their service offering to the expectations of the modern citizen or business, given the budgetary and technical constraints many public organizations suffer under. All the more reason to celebrate when visionary public organizations succeed at this difficult exercise, because we owe it to our shared welfare system to keep striving to do better. We can surely all agree that public employees should spend most of their time on what creates the most value for citizens, and the least on bureaucracy and technicalities. That is why we must find the solutions that best serve both citizens and public employees.

Public sector leaders can sign their organizations up for a service check at www.offentligservicepris.dk and will receive a concrete benchmark study of where their organization stands relative to other service bodies in the public sector.


On 22 October we will crown the first winners of Den Offentlige Servicepris. But although we absolutely believe the winners deserve to be honored, the broader hope is that the award will raise awareness of, and sharpen ambitions for, how we want to serve our citizens and businesses, and what experience they should be left with after their encounter with public services.

I hope we will see a great many entries and a good, relevant debate about the public service offering of the future. We have already started the conversation: come and join the Den Offentlige Servicepris LinkedIn group, which you can find here.


Building customer profiles in the era of big data

Following my latest blog post on executing omni-channel strategies, the first step is to gain a better overview and understanding of your customers.

In recent years, the massive increase in data volumes and the variety of sources have completely changed the concept of customer profiles for most companies. From a marketer's view, this has created what we could call a 'big data challenge', in which it becomes very complex to gain an overview of individual customers across multiple platforms and contact points and to embed the digital data residing there.

Extensive benefits can be gained from working with these master customer profiles, including more effective customer interactions and more efficient underlying processes.

In my experience, master customer profiles can be established through four sets of activities:

  1. Identify customer data
  2. Connect customer identifiers
  3. Establish data hierarchies
  4. Build customer profiles with automated data flows

Activity 1: Identify customer data
The initial activity is to identify and gather customer data located internally and externally. Internal sources are often scattered across the organization, and valuable insights into customer demands and market trends can be derived from these data. Alongside these, a great variety of external data sources is available, such as social media, competitor activity and demographic profiles.

Figure 1 illustrates different data sources, internal and external. Their volume and variety create big data challenges, for which we need a method of structuring data in an intelligent way. This is why it becomes necessary to focus on the customer identifiers.

Figure 1: Examples of internal and external data sources. With the increasing number of digital platforms and customer engagement, a much wider range of rich data sources is available.

Activity 2: Connect customer identifiers
With the explosion of digital platforms, customers have a wide range of identifiers beyond the traditional ones, such as account numbers or loyalty memberships. To gain a complete picture of the customer, we have to connect these identifiers across platforms. This provides access to new data and more arenas for interaction, as well as an understanding of the synergies and mechanisms behind them.

Figure 2 shows a number of different digital platforms and identifiers which we need to understand and link together. Once these have been linked, we need to build customer profiles.

Figure 2: Examples of various identification profiles linked to an individual customer. The picture expands continuously as new digital platforms emerge.
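Linking identifiers like this is essentially a graph-connectivity problem: identifiers observed together (on the same login, purchase or loyalty signup) belong to the same customer. A minimal sketch of this idea, using a union-find structure over hypothetical identifier strings (the identifiers and events below are invented for illustration, not from any real system):

```python
from collections import defaultdict

class IdentityGraph:
    """Union-find over customer identifiers: identifiers observed
    together are merged into one cluster representing one customer."""

    def __init__(self):
        self.parent = {}

    def find(self, x):
        # Register unseen identifiers as their own root.
        self.parent.setdefault(x, x)
        # Walk to the root, halving the path as we go.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def link(self, a, b):
        # Record that a and b were observed together.
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

    def clusters(self):
        # Group every known identifier under its root: one set per customer.
        out = defaultdict(set)
        for x in self.parent:
            out[self.find(x)].add(x)
        return list(out.values())

g = IdentityGraph()
g.link("email:anna@example.com", "loyalty:12345")  # loyalty signup
g.link("loyalty:12345", "cookie:af93")             # web login
g.link("email:bob@example.com", "cookie:7c21")     # a different customer
print(g.clusters())
```

Each resulting cluster is one customer profile key; new observations simply add links, so the picture grows as new platforms and identifiers emerge.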

Activity 3: Establish data hierarchies
Once we have a clear view of the data available at present and the connection between the different identifiers related to each customer, the next step is to create the master level – or data hierarchies.

Several types of hierarchies exist, such as household level, product level and customer level. Household level is key in most industries, as we need to understand, for example, which customers have already purchased a product from us and which have not. Figure 3 illustrates this.

An example could be an insurer who wishes to understand which customers do not yet have household insurance. Looking only at individual customers without insurance is not sufficient here, as it does not take into account which customers live together. You might end up wasting marketing investments and compromising contact policies by promoting the product to customers already covered through their spouse.

The identification of stakeholders in a household can provide additional insight, e.g. for segmentation purposes.
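The insurer example above can be sketched in a few lines: once each customer record carries a household identifier, coverage is evaluated at household level rather than customer level. The mini-dataset and field names below are hypothetical, purely for illustration:

```python
# Hypothetical records: each customer carries a household identifier.
customers = [
    {"id": "C1", "household": "H1", "has_home_insurance": True},
    {"id": "C2", "household": "H1", "has_home_insurance": False},  # spouse of C1
    {"id": "C3", "household": "H2", "has_home_insurance": False},
]

def uncovered_prospects(customers):
    """Customers eligible for a household-insurance offer: only those
    whose entire household lacks the product."""
    covered = {c["household"] for c in customers if c["has_home_insurance"]}
    return [c["id"] for c in customers if c["household"] not in covered]

# C2 is excluded even though C2 has no policy: household H1 is already covered.
print(uncovered_prospects(customers))  # → ['C3']
```

Evaluating at household level is what protects the contact policy: the naive customer-level filter would have targeted C2 as well.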

Another relevant hierarchy applies to companies that have several brands or sub-organizations. As brands can have different underlying categories and even more products, the complexity can often be substantial. Since we want an overview of our customer relations and the potential for cross- and up-selling, establishing or including product hierarchies is important.

Once the mapping is done, we cannot rest on our laurels. As part of the process of generating insights, we need to be able to work actively with the mapping on a continuous basis. More on this subject in my next blogpost.

Figure 3: Examples of different types of hierarchical levels that are relevant when establishing and working with customer profiles. The most common ones are on household and product level.

Activity 4: Build customer profiles with automated data flows
Once we have an overview of our data, we can start to build the bridges between the different systems. I prefer to work with virtual layers rather than commencing a data warehouse project. As data tends to change over time, we need a flexible foundation to work with – data currency is a good example (consider how fax numbers have faded from relevance). And as some data sources, such as individual customer behavior on websites, can produce extensive data volumes, we need to consider storage. Storage options have changed significantly, brought forth by a little yellow elephant – Hadoop.

These types of customer profiles have also been named DNA strands by some people, as you might picture all these data as genes on a strand; all connected to each customer enabling us to distinguish them from one another.

What I have found significant here is that the data bridges need to be automated – and, as a key element of working omni-channel, this needs to happen in real time. This demands a change from the old siloed infrastructure to one in which data flows back and forth continuously – running data quality processes, calculating propensity scores, checking contact policies etc. As customers interact with us at all times, we need to move beyond nightly batches and into a world where data is kept fresh at all times.

Figure 4 illustrates this virtual data layer with various data from different sources summing up the customer profile – and with seamless integration in real time illustrated by the arrows.

Figure 4: Example of a flexible data foundation drawing upon data from various sources – maintained in real time for optimal customer interaction in any channel at any time of the day.
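The virtual-layer idea can be sketched as a profile assembled on demand from the underlying systems at call time, instead of copied into a warehouse by a nightly batch. The source systems, field names and customer id below are invented stand-ins; in practice each lookup would be a live query or API call:

```python
import time

# Hypothetical source systems exposed as simple lookups; in a real
# implementation these would be live queries against the actual systems.
crm    = {"C1": {"name": "Anna", "segment": "gold"}}
web    = {"C1": {"last_visit": "2015-03-01", "pages_viewed": 14}}
scores = {"C1": {"churn_propensity": 0.12}}

SOURCES = [crm, web, scores]

def customer_profile(customer_id):
    """Virtual-layer sketch: assemble the profile from every source at
    call time, so it is exactly as fresh as the sources themselves."""
    profile = {"id": customer_id, "fetched_at": time.time()}
    for source in SOURCES:
        # Merge whatever this source knows about the customer.
        profile.update(source.get(customer_id, {}))
    return profile

print(customer_profile("C1")["segment"])  # → gold
```

Because nothing is materialized, adding a new data source is just another entry in the list – which is the flexibility argument for the virtual layer over a fixed warehouse schema.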

Start out building the foundation
Building customer profiles is a key step in executing an omni-channel strategy – and the one that often demands the greatest effort. It lays the foundation for everything we do, as it provides insight into our customers and markets as well as our internal processes and organization – in general, the current possibilities and limitations.

As I explained in my previous blogpost, executing an omni-channel strategy is an iterative process – and this goes for the individual steps as well. Building customer profiles is no exception: my experience shows that new data continues to appear, either from within the organization or externally, e.g. through enriched data and new identifiers.

For this reason, I recommend this step as the first when commencing this type of journey. Data will always be key in any type of project, as it defines the possibilities as well as the limitations. In addition, any marketing or sales concept should always be validated and supported by facts, which is why I always suggest including data exploration in the creative processes.


Are Nordic companies maturing their big data journey?

For years, large geographical regions such as the US and the UK have experimented with big data analytics to gain true business advantages – creating an opportunity to adapt better to customer demands, to optimize business models and to mitigate risk. Meanwhile, most companies in the Nordic countries have been in a watch-and-wait mode and not set out on a big data journey to adopt technologies such as Hadoop.

During the past year, SAS Institute has experienced increasing interest from organizations eager to do more with the vast variety of data available to them. For this reason, SAS decided to take the temperature of the adoption of Hadoop technologies and big data analytics in the Nordic region: the maturity by country and industry, and the primary drivers, use cases and obstacles at present.

In December 2014, we conducted telephone interviews and an online survey with more than 300 IT managers from companies across industries in Denmark, Sweden, Norway and Finland.

We found that 85% of companies in the Nordic region have a growing need for performing analytics on more data, and 6 out of 10 see a need to collect new types of data (such as clickstream, machine log, text, sensor, social media, videos and images).

The Nordic companies know that it is time to consider a new approach. As many as 92% believe they will gain competitive business advantages from a broader information picture for analytics and business decisions. Customer and market intelligence, operational efficiency and innovation/R&D in particular are key areas for setting out on a big data journey and considering a technology such as Hadoop as a platform for exploratory analytics. Moving from volume marketing to a more personal approach will open up new markets, products and ways to grow a company.

This is all good. However, only half of the companies believe their current infrastructure is capable of meeting these needs and demands. But Nordic companies are ramping up: according to the survey, 9% of the companies either have installed or are about to install Hadoop, and adoption is expected to double in 2015 – and taking shadow IT into consideration, these numbers might be higher.

Most organizations are aware of the need for faster information to make timely decisions on new and changing market opportunities, customer expectations and risks. Yet only 13% believe they meet these demands satisfactorily today – and what happens when they include more data and new types of data?

Two reasons for the slower adoption of big data platforms such as Hadoop in the Nordic region seem to be concerns about having to acquire new resources and skills not currently in the organization, and uncertainty about the maturity of the technology.

What we experience is that the technology has matured and has a proven track record – and in many cases, organizations already have people with insight into data and analytics. So is this just a matter of others showing how it should be done – and does the real issue rather concern who actually owns a big data analytics initiative within the organization?

Perhaps not surprisingly, companies within telecommunications and financial services are the most mature, leading the way in the Nordic region. Midsize and large companies in several other industries follow suit – and as more and more companies mature, organize themselves and demonstrate their competitive edge by adopting big data analytics, the buzz will disappear.

Learn more about the findings from the Nordic survey.
