Original source publication: Polónia, F. and F. de Sá-Soares (2013). Key Issues in Information Systems Security Management. Proceedings of the 33rd International Conference on Information Systems—ICIS 2013. Milan (Italy).
The final publication is available here.

Key Issues in Information Systems Security Management

Fernando Polónia and Filipe de Sá-Soares

Universidade do Minho, Centro Algoritmi, Guimarães, Portugal

Abstract

The increasing dependence of organizations on information and the need to protect it from numerous threats justify the organizational activity of information systems security management. Managers responsible for safeguarding information systems assets are confronted with several challenges. From the practitioners' point of view, those challenges may be understood as the fundamental key issues they must deal with in the course of their professional activities. This research aims to identify and prioritize the key issues that information systems security managers face, or believe they will face, in the near future. The Delphi method combined with the Q-sort technique was employed, using an initial survey derived from a literature review and followed by semi-structured interviews with respondents. A moderate consensus was found after three rounds, with high stability of results between rounds. A ranked list of 26 key issues is presented and discussed. Suggestions for future work are made.

Keywords: Key Issues; Information Systems Security; Information Systems Security Management; Information Systems Management; Delphi; Q-Sort

Introduction

In a fast-changing business environment, where innovative uses of information made possible by new technologies occur at a rapid pace and are accompanied by the emergence of new threats to information assets, information systems (IS) managers face myriad challenges to maintain an adequate IS security (ISS) level and, by extension, to preserve the integrity of organizations and their ability to survive and thrive in the market. From the practitioners' point of view, those challenges may be understood as the fundamental key issues they deal with in the course of their professional activities.

This study aims to identify and prioritize the key issues that ISS managers face, or believe they will face in 5 to 10 years. Capturing and classifying those issues according to their importance is of value to different stakeholders. Namely, it can assist top level management in the analysis of IS strategic investments; guide vendors in the development of security products; improve IS managers' understanding of the activities and responsibilities that surround their job; make information security consultants aware of the most important ISS issues in the industry's reality as sensed by ISS managers; and suggest potential areas of interest for researchers to investigate more deeply.

The paper is organized as follows. First, we review the major studies on key issues conducted in the area of IS. Then, the research design is outlined, followed by the description of the study that was conducted. Afterwards, we present and discuss the main findings of the research. The paper ends by drawing conclusions, identifying limitations, and advancing future work opportunities.

Studies of Key Issues in Information Systems

The field of IS has had a number of studies aimed at identifying key concerns of IT management executives by asking a panel of experts their opinion on a given subject. This tradition of asking a panel to sort their most important concerns began in 1980, in the USA, when the Society for Information Management (SIM) commissioned a survey to uncover the key issues its members were facing. Subsequent studies were conducted at approximately three-year intervals. After a gap of nine years, between 1994 and 2003, the surveys were resumed on an annual basis. Table 1 summarizes those studies. For each study, the table indicates the year of the survey, the number of times (rounds) participants were asked to classify the issues, the number of participants per round, and the number of issues subject to rating.

Table 1: Studies on Information Systems Management Key Issues

Table 1

The studies on IS management concerns sponsored by SIM were able to identify and prioritize several issues deemed important to IT executives. The periodic administration of the surveys allowed speculation on the longitudinal evolution of the key issues, the highlighting of enduring concerns, the consideration of new challenges, and the identification of passing issues that were no longer relevant.

From a methodological point of view, the studies promoted by the SIM from 1983 to 1994 applied the Delphi method, a common technique used to gather information from a panel, requiring several survey rounds to identify and rank the key issues. The use of successive rounds in a study forces the panel to find a consensus, a consolidated result from the entire panel as a whole on the order of importance of the issues. Some studies combined Delphi with other research methods, such as telephone interviews, enriching the study with a qualitative component. From 2003 to 2011, the SIM Board decided to query respondents in a single round, according to a procedure similar to the one used in the 1980 study.

In most of these studies, concerns surrounding information security aspects have been present. In 1980 the issue "Problems of Maintaining Data Security" ranked twelfth. In 1983 position 14 was occupied by the issue "Information security and control". After being dropped from the study of 1994, this security issue reappeared in 2003, ascending to 3rd place under the title "Security and Privacy". Since then, it has been on the top 10 list of subsequent studies.

Considering the evolution of security related issues over the last 30 years, it should be noted that the 1980 study also included the issue "Problems of Maintaining Information Privacy", which ranked fourteenth. Ten years later, in 1990, a new related issue joined the list: "Establishing Effective Disaster Recovery Capabilities" (ranked 20th). An important observation relates to how the security issue was perceived by the authors of the studies. Until 2010, security and privacy were viewed as a technical consideration and the "only technical issue" [Luftman and Ben-Zvi 2010a, p. 267] in the top 10 management concerns from 2007 to 2010. However, in the last three studies (2010-2012), the authors denote a change in the way this issue is understood, observing that "security is recognized as a management issue, rather than purely a technical one" [Luftman and Derksen 2012, p. 211].

The analysis of the studies shows that information security concerns appear mixed with the various concerns of IS managers. This realization was the main motivation for this study and led to the formulation of the following research question: What are the key issues that Information Systems Security managers face, or believe they will face in 5 to 10 years? In other words, we argue for the need to look inside the information security concerns of the previous studies, dissecting and isolating the word "security" by identifying the various aspects that may compose it.

Research Design

To accomplish the study's objective, the Delphi method was selected. In the IS field, the popularity of this research strategy is particularly high when key issue classification studies are conducted [Okoli and Pawlowski 2004]. Delphi uses a series of linked questionnaires, usually known as rounds, where participants are continually asked to re-evaluate their answers (to the same questions) in light of the summarized group result of the previous round. That is, after each round, results are summarized, given back to the experts, and a new (usually identical) questionnaire is provided for another evaluation. For this evaluation the expert should take into account the aggregated opinion of the other experts. Three to five questionnaires are usually needed until a consensus among participants is achieved [Delbeq 1986]. Researchers usually apply this method expecting consensus to be reached among participants. If that is not the case, the results may still give important insights, since the Delphi method may identify divergences and contradictions between participants, the analysis of which is as valuable as that done on data from consensus [Procter and Hunt 1994].
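For illustration, the feedback loop just described can be sketched as follows (a minimal Python sketch; collect_rankings is a hypothetical placeholder for gathering the experts' rankings in a given round and is not part of the instrumentation used in this study).

import numpy as np

def aggregate_ranks(rankings):
    """rankings: (experts x issues) matrix of ranks; returns the mean rank per issue."""
    return np.asarray(rankings, dtype=float).mean(axis=0)

def run_delphi(collect_rankings, max_rounds=3):
    """Run up to max_rounds; after each round the summarized group result is fed back to the panel."""
    feedback = None
    for round_no in range(1, max_rounds + 1):
        # experts (re-)evaluate the issues in light of the previous round's summary
        rankings = collect_rankings(round_no, feedback)
        feedback = aggregate_ranks(rankings)
    return feedback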

Since the aim of the study was not limited to the enumeration of ISS management concerns, but also intended to categorize them according to their level of importance, we asked participants to classify the issues by ranking them via the Q-sort technique. The Q-sort technique consists of a sorting procedure where a set of cards with inscriptions (phrases, words, or figures) is laid down in a pyramid. In this study, for each key issue there was a corresponding card. Q-sort has a set of specific procedures that have to be performed by participants. These procedures start with the familiarization of the subject with all the cards.

For that step, each key issue designation was supplemented with a brief statement making explicit its intended meaning. Afterwards, the subject should separate the cards into three distinct groups: "very important", "less important", and "neutral, ambivalent, or of median importance". Following this step, an iterative process starts where the participant extracts, one at a time, the most important issue from the "very important" group and the least important issue from the "less important" group. The process goes on until all cards from both groups are used. Finally, the participant sorts the most important issues from the "neutral" group and makes arrangements so that the ranking reflects his/her best opinion. The choice of Q-sort for the ranking step prevents each issue from being assessed independently, as if it had no connection with the remaining issues.
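As an illustration of the ordering logic just described, the sketch below collapses the three piles into a single ranking (a simplification; the importance function stands in for the participant's judgment and is not part of the Web tool used in the study). The iterative extraction of extremes is equivalent to sorting each pile.

def q_sort(very_important, neutral, less_important, importance):
    """Return a full ranking of the cards, from most to least important.
    `importance` is a hypothetical stand-in for the participant's judgment of each card."""
    top = sorted(very_important, key=importance, reverse=True)     # most important card is extracted first
    middle = sorted(neutral, key=importance, reverse=True)         # the median pile is ordered last
    bottom = sorted(less_important, key=importance, reverse=True)  # least important card ends up at the bottom
    return top + middle + bottom

# example: q_sort(["A", "B"], ["C"], ["D", "E"], importance=scores.get), with scores a dict of judgments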

At the design stage of this research the possibility of conducting interviews with participants in the Delphi study after the last round was also foreseen. This would add extra value if the analysis of the Delphi results justified the need for collecting qualitative data that would allow a better understanding of the final list of ISS key issues or the final ranking.

The setup of this study involved the following tasks that will be detailed next: selection and invitation of experts to the Delphi panel, decision on the design of rounds, definition of the means of communication with experts, and definition of round-stopping criteria.

Delphi Panel

The Delphi panel is the set of experts that participate in the study. There are no rules for the inclusion of members in a panel, except for the fact that they are considered to be experts [Preble 1984]. This implies that the choices are based on the judgment of the researchers and on how they define "expert". Beretta [1996] notes that when the opinion of experts is required, it is not generally appropriate to employ random sampling. An expert is understood as someone who is a specialist in a field of study or someone with knowledge and experience regarding a specific issue, whether that knowledge is empirical or otherwise. In terms of panel size, there is no consensus in the literature. Although there is no evidence that wide panels are necessary [Duffield 1993], a qualitative aspect that should be considered is the heterogeneity or interdisciplinary nature of the panel, with the objective of increasing the validity of results [Martino 1972].

Since we did not have a prior pool of ISS managers from which we could form the panel, we started by identifying the 500 largest companies in Portugal. To this list we added the top 200 IT companies in Portugal (by revenue). These companies were contacted by email, fax, and telephone. An invitation to collaborate in the study was sent, requesting the name, email address, and phone contact of the professional responsible for ISS matters in the organization. Public services, hospitals, health care centers, universities, institutes, and colleges were also contacted. Also included were contacts obtained through academic and professional conference programs and workshops, as well as contacts suggested by experts in the area. From this process, 182 experts were identified. A personal letter was then sent to each of these experts, inviting them to participate in the first round of the study by giving their opinion on the importance of each issue.

Design of Rounds

In what concerns the design of rounds, we had to decide whether the first round would be a blank-sheet round, used to collect from participants their opinions on what the key issues in ISS management are, or a regular ranking round, requiring the provision of a list of potential key issues that participants would classify. We chose the latter, mainly because we could compose an initial list of issues from the literature to reduce the effort required from participants. It was also decided that in the first round participants could add new concerns to the original list of issues compiled. This ensured that the list could be enriched with issues that somehow escaped the scope of the literature, giving experts the ability to adjust the list in case they found that the presented issues did not best reflect their reality. The subsequent rounds would be closed (participants would not be able to add new items). This decision also helped to keep the number of rounds within a reasonable limit. An additional decision was to not drop the issues that were considered less important, unless at the end of the first round the number of new concerns suggested by participants rendered the next round prohibitive in terms of respondents' effort.

To collect data, we used a Web tool to administer the surveys. The availability of such a tool gives participants the convenience of answering and reduces the time between rounds, helping the panel keep its enthusiasm and eliminating time spent on the data transcription phase. The tool implemented the procedures associated with the Q-sort technique, thus ensuring the participants would correctly perform its steps.

Communication Protocol

The communication protocol to be used between the research team and the participants during the Delphi rounds had a set of predefined rules. An email message sent to each participant initiated each round. This message included information about the purpose of the study, by recalling and explaining the research question that would guide the expert's participation; the goal and design of the round; the period within which it would be open to answers; the Web link and credentials that would enable participation; a guarantee of confidentiality and anonymity; an emphasis on the importance of the expert's contribution to the success of the study; indication of researchers' contacts for any unforeseen circumstance; and, for the second and subsequent rounds, a brief description of the consensus obtained in the previous round. The evolution of the answers in each round would be closely monitored by researchers. To those participants who had not yet responded, a reminder would be sent by email two days before the closing date of the round and on the day the answering period would end. In case there was evidence of difficulties or unavailability of the participants, the answering period would be extended. Finally, every contact made by participants would be replied to promptly by the research team.

Stop Criteria

The majority of the Delphi studies that were reviewed define two main criteria for the Delphi rounds to stop. The first one dictates that rounds should stop in the absence of progression of consensus between rounds, or alternatively, when a high consensus degree in one round is achieved. The second consists of a pre-defined maximum number of rounds whose determination is based on the behavior of experts observed in previous studies, independently of the consensus values obtained. Regarding the first stop criterion, we used the nonparametric tests of Kendall's coefficient of concordance (W) and Kendall's rank-order correlation coefficient (T) to evaluate if a high consensus was obtained or if progression between rounds stopped.

The computation of Kendall's W outputs a numeric value in the interval [0, 1], indicating percentage values that vary between "lack of agreement" (0%) and "complete agreement" (100%). Schmidt [1997] put forward a qualitative table to interpret the values of this concordance coefficient (also called the consensus degree), as presented in Table 2. This coefficient is widely used in key issues studies with Delphi [Brancheau and Wetherbe 1987; Okoli and Pawlowski 2004]. The value of W is calculated for each round, indicating the agreement obtained between the experts in the particular round.
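For reference, the standard tie-free formulation of Kendall's W for m experts ranking n issues is

W = \frac{12 \sum_{i=1}^{n} \left( R_i - \frac{m(n+1)}{2} \right)^{2}}{m^{2} (n^{3} - n)},

where R_i is the sum of the ranks assigned to issue i; W = 0 corresponds to complete lack of agreement and W = 1 to complete agreement.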

Table 2: Interpretation of Kendall's W
Adapted from Schmidt [1997]

Table 2

Degree of concordance indicates the level of consensus in the evaluation made by experts, on the same set of key issues, in a given Delphi round. However, it is also necessary to measure the degree of concordance between each evaluation made by the panel as a whole, over time. This last measurement reveals the stability of the panel's selection throughout the rounds, indicating if the opinions are converging or diverging as a whole. Kendall's rank-order correlation coefficient (T) was the metric applied to determine this. It outputs a value between -1 (perfect disagreement) and 1 (perfect agreement), with 0 suggesting that the variables involved are independent and therefore unrelated. In this study we expected to see an increase of the positive coefficient over time.
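In its standard tie-free form, for n issues compared across two rankings, the coefficient is

T = \frac{n_c - n_d}{n(n-1)/2},

where n_c and n_d are the numbers of concordant and discordant pairs of issues between the two rankings.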

It is advisable to establish a maximum number of rounds, with Procter and Hunt [1994] observing that two or three rounds are the usual limit. A high number of rounds may have negative consequences as there is a high probability of exhaustion of the panel [Marsden et al. 2003]. Hence, in this study we established that the maximum number of rounds would be three, independently of the results.

To sum up, in the present study the stop criteria relied on two conditions: the rounds would be interrupted if a round presented a high consensus value that had not progressed from the preceding round or, in the absence of this condition, at the end of the third round.
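In pseudo-code form, this decision rule can be sketched as follows (the numeric threshold for a "high" W is an assumption made for illustration; in the study W was interpreted qualitatively via Table 2):

def should_stop(round_no, w_current, w_previous, max_rounds=3, high_w=0.7):
    """Stop when a high consensus value fails to progress from the preceding round, or at the round limit."""
    high_consensus = w_current >= high_w
    no_progress = w_previous is not None and w_current <= w_previous
    return (high_consensus and no_progress) or round_no >= max_rounds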

Description of the Study

The study involved the elaboration of the initial list of ISS key issues, the administration of surveys over three Delphi rounds, and final interviews with a subset of respondents to the last two rounds.

Development of the ISS Key Issues List

Following the decision to not start the Delphi study with a blank-sheet round, we first had to develop a list of key issues in the area of ISS that would be presented to participants in the initial Delphi round. We developed this list of issues from the literature. Magazines, journals, books, and conference proceedings in the area of interest are the most widely used resources for building such a list. In this technique, a time limit needs to be defined to narrow the literature analysis, that is, all sources are scrutinized from a given year to the present date. This decision was made considering the aim of the study, namely that the key issues should illustrate the main concerns of ISS managers. Therefore, the literature should reflect issues that are (or will be) important, among them issues that are constantly present in this area of knowledge. Both new and enduring issues can be found in recent publications, as they are either a novelty or a constant concern; thus, going back to older publications would only add issues that are no longer current.

Several magazines and journals in the areas of IS management and ISS were selected to be analyzed in order to discover the issues currently being discussed. All issues of the following publications from January 2005 onwards were part of this analysis, namely: CIO, CIO Insight, Windows IT Pro, Communications of the ACM, Information Security Journal: A Global Perspective, International Journal of Information Security, and Journal of Information System Security. These sources combined publications aimed at practitioners with those aimed at academics, and covered security issues as well as top management issues. The total number of articles analyzed amounted to 1098, distributed by source as illustrated in Table 3.

Table 3: Number of Articles Analyzed by Source

Table 3

Many of the articles discussed concerns that were beyond the scope of this investigation or did not add any key issue at all, so they were discarded. To arrive at the final list of issues, we identified, aggregated, and described those issues. An iterative approach was used in which new keywords were added to a list and described in context. The list of keywords and their contexts were analyzed in order to interpret their purpose. From this step we obtained a list of issues pertinent to ISS management. Finally, we needed to ensure that the list was solid in terms of content. A small group of information security experts from industry participated in an initial validation of the list before the start of the Delphi rounds. Nonetheless, in the first round of the Delphi all participants were asked to validate and suggest changes to the list of issues, in case they felt that the issues described were not representative of their reality. The process of literature analysis and consolidation of potential ISS concerns resulted in a list of 25 key issues that formed the first survey presented to the Delphi panel (cf. Appendix A).

Delphi Rounds

The Delphi study involved three rounds. Table 4 summarizes the rounds in terms of duration (number of days that the round was open), number of experts inquired in each round, number of respondents and corresponding response rate per round, and Kendall's W and T values.

Table 4: Summary of Delphi Rounds

Table 4

Table 5 illustrates the distribution of respondents by industry through the Delphi rounds. This distribution can be considered stable during the rounds and does not compromise the heterogeneity of the panel over time or the direct comparative analysis.

Table 5: Distribution of Respondents by Industry through Delphi Rounds

Table 5

On the first round, a single new issue was introduced: "Ensure an appropriate level of tolerance to Information Systems Security incidents", increasing the number of key issues addressed in the study to 26. Furthermore, the description of the issue "Align Information Systems Security policies with business strategy" was changed to match a suggestion from one respondent. For the second and third rounds, only the respondents of the first round were contacted. The inclusion of non-respondents in later stages of the study could bring entropy to the results, since these participants would not be familiar with the technique and the issues.

Figure 1 depicts the evolution of the rank of each key issue throughout the rounds (for the discussion ahead, it is not necessary to be aware of the correspondence between each line and a given key issue). The evolution of each key issue rank throughout the three Delphi rounds can be found in Appendix A. It should be noted that on the first round the issues were considered individually by the experts. However, for subsequent rounds, experts had the information from the previous round, explaining the strong variation of some issues between the first two rounds. It can be seen, however, that the steepness of the lines diminishes between the second and third rounds, evidencing the increased agreement between the respondents (fewer adjustments were needed).

Figure 1

Figure 1. Evolution of Key Issues through Delphi Rounds

Interviews

After completing the Delphi rounds, we considered it relevant to add qualitative data to the study. All respondents to the second and third rounds were contacted and invited to comment on the results and to assess the study. Fifteen participants made themselves available, representing about 63% of the respondents of the third round and 30% of the respondents of the second round. The average work experience in the area of ISS of these interviewed professionals was 11.6 years.

This fourth stage of research consisted of semi-structured interviews with members of the panel. These interviews were conducted on site, with the first author traveling to the organizations in which the experts worked. With one exception, all interviews were audio-taped so that the researcher could concentrate on the conversation rather than on taking notes.

The interviews aimed to determine the specific reasoning underlying the experts' responses. Due to time constraints, the interviews focused only on the motivations associated with the key issues that were chosen by each respondent as the top 3 most important issues and the bottom 3 least important issues. The experts were also queried on their opinion about the relevance of the study. Without exception, they highlighted the difficulty of sorting some of the issues, while also noting the timeliness and currency of the survey and asserting that it summarizes the ISS management concerns they face or expect to face in 5 to 10 years.

Analysis

In this section we analyze the study in terms of response rate, level of consensus achieved, possibility of random answers by participants, possibility of clustering respondents, and content of interviews.

Response Rate

The response rate for the first round (27.5%) is acceptable since there was no previous agreement with the experts to participate in the study. This represents 50 experts that participated in the first round. Usually, the response rate for Delphi studies is between 40% and 50% [Linstone and Turoff 1975]. Our subsequent rounds achieved response rates of 64% and 48%, respectively.
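For verification, the first-round rate follows directly from the panel size: \frac{50}{182} \approx 0.275, i.e., 27.5\%.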

Consensus

Qualitatively, for the first two rounds, using the interpretation proposed by Schmidt [1997], a weak agreement was found between panel members. For the third round, there was moderate consensus. Additionally, there was growing stability between the first and second rounds, and between the second and third rounds (T took values of 0.649 and 0.830, respectively). This longitudinal stability reveals the progression of the result, i.e., the increasing correlation between rounds. This trend can also be observed in Figure 1, where, for 24 of the 26 issues, the slope (in absolute value) of the line of progression between Round 1 and Round 2 is greater than the slope between Round 2 and Round 3. This simple analysis shows that there was less need for adjustments between the second and third rounds than between the first and second rounds.
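For illustration, both metrics can be computed from the experts-by-issues rank matrices of each round roughly as follows (a sketch with hypothetical variable names, not the scripts used in the study):

import numpy as np
from scipy.stats import kendalltau

def kendalls_w(ranks):
    """ranks: (m experts x n issues) matrix of ranks 1..n without ties."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape
    s = ((ranks.sum(axis=0) - m * (n + 1) / 2.0) ** 2).sum()
    return 12.0 * s / (m ** 2 * (n ** 3 - n))

def panel_ranking(ranks):
    """Aggregate panel ranking: position of each issue when ordered by its mean rank."""
    mean_ranks = np.asarray(ranks, dtype=float).mean(axis=0)
    return mean_ranks.argsort().argsort()

# consensus within a round:      w3 = kendalls_w(round3_ranks)
# stability between two rounds:  t, _ = kendalltau(panel_ranking(round2_ranks), panel_ranking(round3_ranks))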

Possibility of Random Answers

To judge the likelihood of random answers, an analysis was made between the responses of experts in a given round and their own responses in the preceding round, as well as the group's response. The metric used was Kendall's T. It was found that there is some level of agreement between the responses of each expert in a specific round and the response of the group for the round that preceded it. These observations rule out the possibility that random answers might have been given. It can also be said that the response of each expert is somewhat changeable over time, suggesting a review of the individual ranking against the preceding panel ranking, as is normally expected in Delphi studies.

Possibility of Clustering

Given the heterogeneity of participants in terms of industry, it made sense to consider potential differences between groups or clusters of respondents. The analysis made for each industry showed a "weak" agreement for all the rounds. There might also be other groups that could not be logically identified from the data. To rule out this possibility we applied cluster analysis (Ward's method), aiming to identify collections of experts that shared the same opinion. No relationship was found between the resulting clusters and either the data collected in the Delphi study or the additional data collected in the interviews.
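As a sketch of this check (using SciPy's hierarchical clustering; the variable names are hypothetical and the code is illustrative rather than the analysis actually run):

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_experts(ranks, n_clusters=2):
    """Group experts with similar rank profiles using Ward's method.
    ranks: (m experts x n issues) matrix from a Delphi round; returns one cluster label per expert."""
    Z = linkage(np.asarray(ranks, dtype=float), method="ward")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# the resulting labels can then be cross-tabulated with industry or interview data
# to check whether any shared-opinion group emerges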

Interviews

After interviewing nearly two-thirds of the respondent panel of the third round, it can be said that, keeping the interpretation used so far, the consensus of opinions among the experts is very strong. Their choices were always linked to aspects inherent to the business, while giving less emphasis to the control of the behavior of human resources in IS, either because they do not believe in the effectiveness of existing behavioral controls or because they do not implement such controls. With one exception, all experts were committed to the alignment with business, and the justifications they provided for their rankings are interpretations of this concern in light of their reality. One such expert explained that "security systems don't exist just for myself (...) they exist if they allow us to continue to do business (...) we face it as the maintenance of a factory, if I don't do maintenance of the machines, they will eventually stop". Another expert stressed that "what has to be guaranteed is the continuity of the business (...) guaranteeing the involvement of top management is essential".

Although the control of human resources is not a top priority for participants, experts were explicit about the need for frequent awareness-raising among the organization's employees regarding adverse information security events. As several experts put it: "someone will be able to violate that control", "we have to protect ourselves in the best possible way without assuming the control", "control's importance is very narrow (...) we either trust, or don't trust people", and "the education process is of extreme importance".

The different realities of the experts, their businesses, the markets in which their organizations operate, and the responsibilities they face provide different interpretations to the same key issue. Although there was a description for each issue, the experience of each expert provided a background against which issues were interpreted from the perspective of their own businesses.

The key issue "Ensure the ability to recover information systems/information" was maintained throughout the three rounds as the top priority. Experts considered that the continuity of the business is paramount. For the experts interviewed, this is a universal concern and it always underlies the first key issues they selected. The explanation provided for the prominence of that key issue derives from the inability to protect IS against 100% of threats; therefore, it is necessary to maintain the possibility of recovering the information system in case of loss or interruption.

The interviews revealed that 14 out of 15 experts explained their choices based on the alignment with their businesses, with experts keeping in mind the importance of continuity as well as the usefulness of security in the context in which their businesses operate. With one exception, all experts agreed that the answers given were based on the intense consideration of their professional realities. The interpretation of each question in the light of their reality determined the relative ranking of the issues. Some experts mentioned that certain aspects do not apply to all businesses and are dependent on the perception of different organizational cultures. This suggests that the information security culture of the organization and its maturity level in terms of information systems security management may play an important role in prioritizing ISS concerns.

Discussion

The ordered list does not show a logical grouping by the nature of the issues. In fact, the list is quite sparse in that regard: there are issues pertaining to different types of controls (technical, formal, informal, and regulatory), as well as issues related to different classes of controls (directing, structuring, learning, preventing, detecting, and reacting). Therefore, we chose to focus the discussion on the top and bottom five key issues of the ranked list and their connection with what was found in the interviews.

The first topic that emerges is "Ensure the ability to recover information systems/information". This has a clear component of reaction to disruptions of business operations, which accords with the concerns discussed in the interviews. Bearing in mind the three conventional principles of information security—confidentiality, integrity, and availability—we may conclude that, at least for this Delphi panel, the main concern lies in being able to respond to the withholding of data or IS resources. This result provides empirical support for the claim made by Parker [1998, p. 212] that "In business, the first priority of security is to assure availability; if we do not have information available, its integrity and confidentiality are of no concern."

The second topic has a detection component and aims to "Detect Information Systems Security anomalies (intrusions, breaches, attacks, etc.)". This aspect once again reflects the fear of compromising critical business data that may affect the proper functioning of the organization. Only if we detect the occurrence of an adverse event in information security will we be able to react and to learn in order to improve IS protection. Given the complexity of today's business, where there is a need for effective IS governance, carefully engineered IT environments, and attention to risk [Westerman and Hunter 2007], the ability to detect ISS anomalies, both via people and via technical devices, is of paramount importance.

In third place comes a measure related to the direction of the ISS effort, "Get the commitment of top management to the Information Systems Security program in terms of management direction and allocation of resources." This emphasizes the need to have support for ISS initiatives from top management not only in terms of budget, but also their awareness and their involvement. Given the silent trait of good ISS management (e.g., the absence of ISS crises in an organization should not by itself lead to cuts in the ISS budget) and the current need for IT teams to do more with less [Luftman and Derksen 2012], getting or maintaining the commitment of top management to ISS efforts will continue to be a challenge in the upcoming years.

The fourth issue, "Validate the effectiveness of the implemented Information Systems Security measures", is intended to confirm that ISS controls effectively play the role for which they were designed. It is a process that serves to demonstrate the usefulness of ISS controls, helping the organization to assess their real value and contribution to the protection efforts of information systems. Indeed, there is a utilitarian connection between this issue and the previous one: if an ISS manager is not able to show to top management that ISS controls are effective and enable the business operations of the organization, it will be hard to get their commitment to the ISS program.

Finally, the fifth priority, "Align Information Systems Security policies with business strategy", aims to provide a direction and the first guideline to structure the ISS effort within the organization. It is this alignment that justifies the implementation of ISS controls. It was clear from the interviews that experts are fully aware that ISS is only justified in light of the business and that any security control, no matter how sophisticated from a technical point of view it may be, has to prove its utility and contribution to the business.

As found in the interviews, at the bottom of the ranking is a list of issues that relate to control ("Ban the access in the organization to Internet content with potential risk of causing Information Systems Security breaches" – rank 25) and issues that are not connected to the alignment of ISS with business strategy, such as "Make contract vendors and service providers responsible in the event of Information Systems Security breaches", "Obtain certification of systems and procedures in accordance with international Information Systems Security standards", "Justify investments in Information Systems Security to top management", and "Integrate in the organization new Information Systems Security procedures and products." We advance the following explanations for the ranks of these issues.

Contractual liability is neglected since it is seen as unrealistic in the Portuguese legal system, with the experts stating that the company would collapse before being able to recover any loss due to the installation or operation of a product or service. Certifications are seen as mere image-building symbols: it is not necessary to hold them in order to meet the requirements those same certifications demand. Interviewees also added that a number of procedures in these certifications do not apply to some businesses or even impair the functioning of the business, making business processes slow or hindering their flow without any return. The issue that focuses on the justification of investments at first seems contradictory, yet such is not the case in light of the interpretation provided by the vast majority of the experts interviewed. If the investment is aligned with the business, the justification is not considered necessary, since it is the very requirement for top management to meet its strategic objectives. On the other hand, if the investment is misaligned with the business, then it is wrong to try to justify it, as those investment requirements are unfounded, since they are not really necessary. The fact that this issue showed a lower priority may suggest that the majority of the panel members are part of organizations whose management has strategic perceptions and concerns about ISS. Finally, the integration of new procedures and products has relative importance: if it is necessary for the business, then it is covered by the alignment (which is a priority); if it is not needed, then there is no justification for integrating new procedures or products just because they are being marketed. We think that these top and bottom five issues capture well the reasoning and the grounding structure for answering the research question that guided this study.

Conclusion

As the dependence of organizations on information grows, so does the need to protect IS assets. Professionals who are in charge of the management of IS protection face a set of concerns which this research considered important to identify and prioritize. Accordingly, a number of concerns were identified, consolidated, and ordered. In addition, a qualitative interpretation was provided to identify the reasons that support the importance given to each concern.

Limitations

As with any study, this one also presents several limitations. The first limitation relates to the way the survey was designed. Although the survey went through iterative revisions and the first Delphi round was open to the introduction of new issues, some issues may encompass a broader scope than others. Validation during the interviews mitigated this possibility, but it did not eliminate it altogether. A second limitation results from the subjective interpretation of key issues by experts. Although the descriptions associated with each of the key issues were provided to lessen this factor, there still may be room for differences of interpretation. Subjectivity in this study was addressed through the Q-sort technique. This technique has limitations, including forcing a classification of issues without ties, preventing an expert from classifying two issues at the same level of importance. The third limitation we identified is also a requirement of the study: the heterogeneity of the panel. Although this condition was necessary for answering the research question, heterogeneous panels tend to converge more slowly to consensus. The last limitation relates to the study being geographically confined to Portugal, which limits the extrapolation of results to other settings, as other cultures may classify issues from a different perspective.

Future Work

Regarding future work, the immediate suggestion is to replicate this study periodically, checking for the evolution of ISS management concerns. Similar studies could also be conducted by restricting respondents to specific industries. These studies would provide data that could be used to compare ISS management concerns across several areas. Alternatively, the maturity level of ISS management practices of organizations could be examined in order to relate it with the ranking of key issues, as maturity may influence the ranking. Finally, we believe that it would be important to replicate this study in other regions and cultures. The lack of "faith" in the Portuguese judicial system, as previously discussed, serves as an indication that other cultures may perceive ISS management key issues differently.

Contributions

Aware of the limitations inherent to any study, with the data obtained during the Delphi rounds and in the interviews conducted to clarify the motivations of experts, we argue that the key issues that ISS managers currently face are outlined in Appendix A, although their priorities depend on the business and on the alignment of ISS with the business strategy. Furthermore, we claim that despite its limitations, the expected contributions of the study were achieved. The ordered list provides a solid contribution to the understanding of the key aspects that rule the ISS manager's job and may be useful for different stakeholders. Besides being helpful in the analysis of IS strategic investments by top management, whether related to protocols, products, or human resources across industries, the list of issues may assist in the dialog between IS managers and ISS managers, contributing to a mutual understanding of responsibilities and priorities. The consideration of the ranked list of issues may prompt ISS developers and consultants to evaluate their current offers in terms of products and services, as well as to gain a better understanding of why some issues are devalued by current ISS managers. In a similar vein, ISS researchers may find potential areas of research where deeper investigation and new perspectives may be justified, such as demonstrating the value of ISS, developing more sophisticated and comprehensive detection techniques, understanding why behavioral controls need to improve their effectiveness, or expanding the benefits of ISS certification for organizations.

Acknowledgments

This work is funded by FEDER funds through Programa Operacional Fatores de Competitividade—COMPETE and National funds by FCT—Fundação para a Ciência e Tecnologia under Project FCOMP-01-0124-FEDER-022674.

Appendix A—Results by Delphi Rounds

Issues sorted according to the final ranking.
† Issue suggested by one participant in the first round.

Table Appendix

References