Stockholm University

Liane Rose Colonna, Associate Senior Lecturer, Docent

About me

Liane is currently employed as an Associate Senior Lecturer in Legal Informatics at the Department of Law, Stockholm University, where she conducts research within the Wallenberg AI, Autonomous Systems and Software Program – Humanities and Society programme (WASP).

Read more about Liane on the English page.

Research projects

Publications

A selection from Stockholm University's publication database

  • Artificial Intelligence in Higher Education: Towards a More Relational Approach

    2022. Liane Colonna. Journal of Regulatory Compliance VIII, 18-54

    Article

    To contribute to the emerging discipline of Responsible Artificial Intelligence (AI), this paper seeks to determine in more detail what responsibility means within the context of the deployment of AI in the Higher Education (HE) context. More specifically, it seeks to disentangle the boundaries of legal responsibilities within a complex system of humans and technology to understand more clearly who is responsible, and for what, under the law when it comes to the use of facial recognition technology (FRT) in this context. The focus of the paper is on examining the critical role and distinct nature of Ed Tech in providing FRT to the HE sector. Applying relational ethics theory, it asks what the legal obligations of Ed Tech product and service developers (private organizations) are in relation to the universities (public and private authorities involved in teaching and research), teachers, students and other stakeholders who utilize these AI-driven tools.

    Read more about Artificial Intelligence in Higher Education
  • Implementing Data Protection by Design in the Ed Tech Context: What is the Role of Technology Providers?

    2022. Liane Colonna. Journal of Law, Technology & the Internet (JOLTI) 13 (1), 84-106

    Article

    This article explores the specific roles and responsibilities of technology providers when it comes to implementing Data Protection by Design (“DPbD”) and Data Protection by Default (“DPbDf”). As an example, it looks at the Education Technology (“Ed Tech”) sector and the complexities of the supply chains that exist therein to highlight that, in addition to the Higher Education (“HE”) institutions that procure products and services for advancing teaching and learning, Ed Tech vendors may also have responsibility and liability for the processing of students’ personal data. Ultimately, this paper asks whether there are any legal gaps, ambiguities, or normative conflicts to the extent that technology providers can have responsibility in contemporary data processing activities yet escape potential liability where it concerns issues of General Data Protection Regulation (“GDPR”) compliance.

    This paper argues that there is befuddlement concerning the determination of which parties are responsible for meeting DPbD and DPbDf obligations, as well as with regard to the extent of this responsibility. In some cases, an Ed Tech provider is a controller or processor in practice together with a HE institution; yet, in others, it may not have any legal responsibility to support the development of privacy- and data-protection-preserving systems, notwithstanding the fact that it might be much more knowledgeable than a HE institution that has procured the Ed Tech product or service about the state of the art of the technology. Even in cases where it is clear that an Ed Tech provider does have responsibility as a controller or processor, it is unclear how it should share DPbD obligations and coordinate actions with HE institutions, especially when the Ed Tech supplier may only be involved in a limited way or at a minor phase in the processing of student data. There is an urgent need to recognize the complex, interdependent, and nonlinear context of contemporary data processing, where there exist many different controllers, processors, and other actors processing personal data in different geographical locations and at different points in time for both central and peripheral purposes. Likewise, the complexity of the supply of software must also be emphasized, particularly in contexts such as the supply of educational technology, where technology providers can play a key role in the preservation of privacy and data protection rights but may only have a tangential link to the universities that ultimately use their products and services. There is also a need for a more dynamic approach to considering responsibility regarding DPbD. Instead of thinking about responsibilities in terms of “purpose” and “means”, the law should shift towards a focus on powers and capacities. The law should also clarify whether technology providers must notify controllers about changes to the state of the art and, if so, to what extent.

    Read more about Implementing Data Protection by Design in the Ed Tech Context
  • Reflections on the Use of AI in the Legal Domain

    2021. Liane Colonna. Law and Business 1 (1), 1-10

    Article

    This paper examines the field of Artificial Intelligence (AI) and Law and offers some broad reflections on its current state. First, the paper introduces the concept of AI, paying particular attention to the distinction between hard and soft AI. Next, it considers how AI can be used to support (or replace!) legal work and legal reasoning. The paper goes on to explore applications of AI in the legal domain and concludes with some critical reflections on the use of AI in the legal context.

    Read more about Reflections on the Use of AI in the Legal Domain
  • A Methodological Approach to Privacy by Design within the Context of Lifelogging Technologies

    2020. Alex Mihailidis, Liane Colonna. Rutgers Computer & Technology Law Journal 46 (1), 1-52

    Article

    Lifelogging technologies promise to manage many of the concerns raised by population aging. The technology can be used to predict and prevent disease, provide personalized healthcare, and to give support to formal and informal caregivers. Although lifelogging technologies offer major opportunities to improve efficiency and care in the healthcare setting, there are many aspects of these devices that raise serious privacy concerns that can undercut their use and further development. One way to manage privacy concerns raised by lifelogging technologies is through the application of Privacy by Design, an approach that involves embedding legal rules into information systems at the outset of their development. Many current approaches to Privacy by Design, however, lack methodological rigor, leaving stakeholders perplexed about how to achieve the objectives underlying the concept in practice.

    This paper will explore ways to develop a Privacy by Design methodology within the context of Ambient Assistive Living (AAL) technologies like lifelogging. It will set forth a concrete, methodological approach towards incorporating privacy into all stages of a lifelogging system's development. The methodology begins with a contextual understanding of privacy, relying on theoretical and empirical studies conducted by experts in human-computer relations. It then involves an analysis of the relevant black-letter law. A systematic approach as to how to incorporate the requisite legal rules into lifelogging devices is then presented, taking into account the specific design elements of these kinds of systems.

    Read more about A Methodological Approach to Privacy by Design within the Context of Lifelogging Technologies
  • Privacy, Risk, Anonymization and Data Sharing in the Internet of Health Things

    2020. Liane Colonna. Pittsburgh Journal of Technology Law & Policy 20 (1), 148-175

    Article

    This paper explores a specific risk-mitigation strategy to reduce privacy concerns in the Internet of Health Things (IoHT): data anonymization. It contributes to the current academic debate surrounding the role of anonymization in the IoHT by evaluating how data controllers can balance privacy risks against the quality of output data and select the appropriate privacy model that achieves the aims underlying the concept of Privacy by Design. It sets forth several approaches for identifying the risk of re-identification in the IoHT and explores the potential for synthetic data generation to be used as an alternative to anonymization for data sharing.

    Read more about Privacy, Risk, Anonymization and Data Sharing in the Internet of Health Things
  • Two Faces of Privacy: Legal and Human-Centered Perspectives of Lifelogging Applications in Home Environments

    2020. Wiktoria Wilkowska (et al.). Human Aspects of IT for the Aged Population. Healthy and Active Aging, 545-564

    Conference

    In view of the consequences resulting from demographic change, using assistive lifelogging technologies in domestic environments represents one potential approach to supporting the elderly and people in need of care in staying longer within their own homes. Yet the handling of personal data poses a considerable challenge to perceptions of privacy and data security, and therefore to accepted use in this regard. The present study focuses on aspects of data management in the context of two different lifelogging applications, considering a legal and a human-centered perspective. In a two-step empirical process, consisting of qualitative interviews and an online survey, these aspects were explored and evaluated by a representative German sample of adult participants (N = 209). Findings show positive attitudes towards using lifelogging, but also high requirements regarding privacy, data security and anonymization of the data. In addition, the study provides deep insights into the preferred duration and location of data storage, and into permissions for third parties to access personal information. Knowledge of preferences and requirements in the area of data management from the legal and human-centered perspectives is crucial for lifelogging and must be considered in applications that support people in their daily living at home. The outcomes of the present study contribute considerably to the understanding of an optimal infrastructure for accepted and willingly utilized lifelogging applications.

    Read more about Two Faces of Privacy
  • Legal and regulatory challenges to utilizing lifelogging technologies for the frail and sick

    2019. Liane Colonna. International Journal of Law and Information Technology 27 (1), 50-74

    Article

    Lifelogging technologies have the capacity to transform the health and social care landscape in a way that few could have imagined. Indeed, the emergence of lifelogging technologies within the context of healthcare presents incredible opportunities to diagnose illnesses, engage in preventative medicine, manage healthcare costs and allow the elderly to live on their own for longer periods. These technologies, however, require coherent legal regulation in order to ensure, among other things, the safety of the device and privacy of the individual. When producing lifelogging technologies, it is important that developers understand the legal framework in order to create a legally compliant device. The current regulation of lifelogging is highly fragmented, consisting of a complex patchwork of laws. There are also a number of different regulatory agencies involved. Laws and regulations vary, depending on jurisdiction, making development of these technologies more challenging, particularly given the fact that many lifelogging tools have an international dimension.

    Read more about Legal and regulatory challenges to utilizing lifelogging technologies for the frail and sick
  • Privacy-Aware and Acceptable Lifelogging Services for Older and Frail People: the PAAL Project

    2018. Francisco Flórez-Revuelta (et al.). Proceedings of the 2018 IEEE 8th International Conference on Consumer Electronics - Berlin (ICCE-Berlin)

    Conference

    Developed countries around the world are facing crucial challenges regarding health and social care because of demographic change and the current economic context. Innovation in technologies and services for Active and Assisted Living stands out as a promising solution to address these challenges, while profiting from the economic opportunities. For instance, lifelogging technologies may enable and motivate individuals to pervasively capture data about themselves, their environment and the people with whom they interact, in order to receive a variety of services that increase their health, well-being and independence. In this context, the PAAL project presented in this paper has been conceived, with a manifold aim: to increase awareness of the ethical, legal, social and privacy issues associated with lifelogging technologies; to propose privacy-aware lifelogging services for older people, evaluating acceptability issues and barriers related to familiarity with technology; and to develop specific applications addressing relevant use cases for older and frail people.

    Read more about Privacy-Aware and Acceptable Lifelogging Services for Older and Frail People
  • Europe Versus Facebook: An Imbroglio of EU Data Protection Issues

    2016. Liane Colonna. Data Protection on the Move, 25-50

    Chapter

    In this paper, the case Europe versus Facebook is presented as a microcosm of the modern data protection challenges that arise from globalization, technological progress and seamless cross-border flows of personal data. It aims to shed light on a number of sensitive issues closely related to the case, namely how to delimit the power of a European Data Protection Authority to prevent a specific data flow to the US from the authority of the European Commission to find the entire EU-US Safe Harbor Agreement invalid. This comment will also consider whether the entire matter might have been more clear-cut if Europe versus Facebook had asserted its claims against Facebook US directly pursuant to Article 4 of the EU Data Protection Directive, rather than through Facebook Ireland indirectly under the Safe Harbor Agreement.

    Read more about Europe Versus Facebook
  • Legal Implications of Data Mining: Assessing the European Union’s Data Protection Principles in Light of the United States Government’s National Intelligence Data Mining Practices

    2016. Liane Colonna.

    Doctoral thesis

    This dissertation addresses some of the data protection challenges that have arisen from globalization, technological progress, terrorism and seamless cross-border flows of personal data.  The focus of the thesis is to examine ways to protect the personal data of EU citizens, which may be collected by communications service providers such as Google and Facebook, transferred to the US Government and data mined within the context of American national intelligence surveillance programs.  The work explores the technology of data mining and examines whether there are sufficient guarantees under US law for the rights of non-US persons when it comes to applying this technology in the national-security context.

    Read more about Legal Implications of Data Mining
  • Schrems vs. Commissioner: A Precedent for the CJEU to Intervene in the National Intelligence Surveillance Activities of Member States?

    2016. Liane Colonna. Europarättslig tidskrift (2), 208-224

    Article
    Read more about Schrems vs. Commissioner
  • Article 4 of the EU Data Protection Directive and the irrelevance of the EU–US Safe Harbor Program?

    2014. Liane Colonna. International Data Privacy Law 4 (3), 203-221

    Article
    • The relationship between the EU–US Safe Harbor Program and the applicable law provisions set forth in the EU Data Protection Directive and the proposed EU Data Protection Regulation requires clarification.

    • A central concern for US companies is that the benefits of enrolment in the EU–US Safe Harbor Program will be undermined by the broad assertions of extraterritorial jurisdiction made by the EU pursuant to Article 4 of the Directive/Article 3 of the proposed Regulation.

    • If the extraterritorial scope of the Directive/Regulation is widely interpreted then many US companies may lose their incentive to join the Safe Harbor Program because the major benefits of joining the Safe Harbor Program—the ability to rely on industry dispute resolution mechanisms, US law to interpret the Principles, and US courts and administrative bodies to hear claims—will be removed.

    Read more about Article 4 of the EU Data Protection Directive and the irrelevance of the EU–US Safe Harbor Program?
  • Data Mining and Its Paradoxical Relationship to the Purpose Limitation Principle

    2014. Liane Colonna. Reloading Data Protection, 299-321

    Chapter

    European Union data protection law aims to protect individuals from privacy intrusions through a myriad of procedural tools that reflect a code of fair information principles. These tools empower the individual by giving him/her rights to control the processing of his/her data. One of the problems, however, with the European Union’s current reliance on fair information principles is that these tools are increasingly challenged by the technological reality. And perhaps nowhere is this more evident than when it comes to data mining, which puts the European Union data protection principles to the ultimate test. As early as 1998, commentators noted that there is quite a paradoxical relation between data mining and some data protection principles. This paper seeks to explore this so-called paradoxical relationship further and to specifically examine how data mining calls into question the purpose limitation principle. Particular attention will be paid to how data mining defies this principle in a way that data analysis tools of the recent past do not.

    Read more about Data Mining and Its Paradoxical Relationship to the Purpose Limitation Principle
  • A Taxonomy and Classification of Data Mining

    2013. Liane Colonna. SMU Science and Technology Law Review 16 (2), 309-369

    Article

    Data is a source of power, which organizations and individuals of every form are seeking ways to collect, control and capitalize upon. Even though data is not inherently valuable like gold or cattle, many organizations and individuals understand, almost instinctively, that there are great possibilities in the vast amounts of data available to modern society. Data mining is an important way to employ data by dynamically processing it through the use of advancing technology.

    The common usage of the term "data mining" is problematic because the term is used so variably that it is beginning to lose meaning. The problem is partially due to the breadth and complexity of activities referred to as "data mining." This overuse, especially from the perspective of those lacking a scientific background, creates a befuddlement and alienation of the topic. As such, individuals seem to haphazardly refer to data mining without a genuine understanding of what this technology entails.

    This paper seeks to demystify data mining for lawyers through a clarification of some of its intricacies and nuances. The goal is to explain how data mining works from a technological perspective in order to lay a foundation for understanding whether data mining is sufficiently addressed by the law. A central ambition is to look beyond the buzzword and to take a realistic view of the core attributes of data mining. In an effort to understand if there is a need for new legal models and solutions, particular attention will be paid to exploring whether data mining is a genuinely new concept or whether it is a case of "the emperor's new clothes."

    Read more about A Taxonomy and Classification of Data Mining

Show all publications by Liane Rose Colonna at Stockholm University