Austria
Artificial Intelligence
Introduction
In Austria, many companies are active in the field of AI, but only a small number of them are concretely engaged in developing AI itself (see E. Prem, S. Ruhland, AI in Austria: An approach based on economic statistical analyses, at bmk.gv.at/dam/jcr:abf0cdc3-bd4c-4335-aef9-8e5b0a33c119/ai_potential_oesterreich.pdf). The leading companies in the AI environment come from the field of software development or offer corresponding data-processing services, followed by business and market consultancies that use their own software to analyse corporate data, stock market data and the like.
New focal points have emerged in the area of production and Industry 4.0 (e.g., for predictive maintenance). The Austrian state supported these research activities between 2012 and 2020 with a total of EUR 910 million in funding; these efforts are now continued under the federal government's strategy Artificial Intelligence Mission Austria 2030 (AIM AT 2030).
Apart from production and industry, Austria also has a large number of research institutions that work with AI. Austrian universities and research institutions have a high level of competence and enjoy a good reputation worldwide, particularly in key AI subfields such as machine learning, symbolic processes, robotics and autonomous systems. One of the best-known institutions for AI research is the Artificial Intelligence Laboratory (AI Lab) of the Linz Institute of Technology (LIT) at Johannes Kepler University Linz, which is headed by Sepp Hochreiter (a pioneer in the field of AI).
1. Constitutional law and fundamental human rights
1.1. Domestic constitutional provisions
Currently, there are no specific constitutional provisions regulating AI systems and no decisions referring to them in Austria.
However, within the framework of the AIM AT 2030 strategy, the Austrian federal government intends to create a legal framework for placing AI on the market, putting it into operation and using it safely, thus guaranteeing the fundamental rights of citizens. The federal government has declared its intention to participate actively at an international level in order to strengthen international legal principles for the digital space (especially human rights and international humanitarian law), and to develop standards for the ethical use of AI in accordance with human rights.
It should be highlighted that the right to privacy is constitutionally protected by Article 8 of the Charter of Fundamental Rights of the European Union, according to which everyone has the right to protection of their personal data. Further, under Section 1 of the Austrian Data Protection Act, everyone (both natural and legal persons) has the right to confidentiality of their personal data, in particular with regard to their private and family life, insofar as there is an interest worthy of protection. Such an interest is excluded only where the data, because it is generally available or cannot be traced back to the person concerned, is not amenable to a claim to secrecy. These constitutionally guaranteed rights also bind private persons. Hence, AI systems in the private sphere that are trained with personal data, or used for decisions concerning a person, can infringe this right.
Finally, the "right to informational self-determination" is derived from this claim to secrecy under Section 1 paragraph 1 of the Data Protection Act. Although the right to informational self-determination is not expressly affirmed, it can be derived from human dignity and general principles of the rule of law, and is also recognised by the case law of the Austrian Constitutional Court (see Constitutional Court 27.6.2014, G 47/2012, G 59/2012, G 62/2012, G 70/2012, G 71/2012). The right to informational self-determination guarantees the freedom of the individual to decide when and within what limits personal facts of life are disclosed. Where data about a person, or predictions of their behaviour, are generated by linking data sets, it is precisely this right that appears to be violated, as the data subject has no influence on the information generated about them concerning customer affiliations, interests, behaviour or lifestyle, all of which indisputably concern their private life.
Against this background, it is particularly problematic that the data subject has no means of influencing the profile or output that an AI system creates about him or her. However, such circumstances have not yet been the subject of a decision.
Lastly, it must be noted that AI as such does not constitute an infringement of any human rights; only specific functions (such as profiling without precise information) can have this character.
1.2. Human rights decisions and conventions
Article 8 paragraph 1 of the Charter of Fundamental Rights of the European Union provides that every person has the right to the protection of their personal data. This right is laid down in the General Data Protection Regulation (GDPR).
Furthermore, according to Article 8 of the European Convention of Human Rights, every person has the right to respect for their private and family life. Although the protection of personal data is not explicitly mentioned, the European Court of Human Rights has always interpreted the concept of "private life" broadly. For example, it has held that this term includes "the personal information which a person has a legitimate expectation will not be made public without his or her consent" (Judgment of 19 September 2013, Application No. 8772/10).
There are no decisions that refer to AI systems and the infringement of the abovementioned convention rights.
2. Intellectual property
IP is relevant for AI systems from two perspectives:
- the protection of the components of an AI system (data, algorithm, model, software); and
- the protection of works that have been created by an AI system.
Neither the Austrian courts nor the legislator has addressed these topics so far.
2.1. Patents
Artificial intelligence regularly involves mathematical solutions realised in software, i.e., computer-implemented processes. In Austria, these are accessible to patent protection only to a limited extent, as computer programs as such are not patentable (Section 1 paragraph 3 No. 5 of the Patent Act). Rather, the program must have a "further technical effect" (see Austrian Supreme Court 25.8.2016, 4 Ob 94/16a). For a patent to be granted, an AI system would therefore have to contribute to solving a concrete technical problem with technical means, such as the control of an autonomous vehicle.
2.2. Copyright
Protection of investments in AI development is only fragmentary. An algorithm that is no more than a calculation rule (i.e., a mathematical expression) cannot be protected as a computer program under Section 40a of the Austrian Copyright Act. Further, the results achieved by an AI system (the trained model and the output data), which are its most valuable components, cannot be protected under the Copyright Act due to the lack of human intervention.
According to the Austrian Supreme Court, when a computer converts millions of found image files into thumbnails completely independently, without further human intervention and solely with the help of a program, this activity does not reach the threshold for a protectable work under the Copyright Act (Austrian Supreme Court, 20 September 2011, 4 Ob 105/11m). This decision does not refer to AI systems per se; however, the circumstances are comparable to situations in which an AI system is able to create a work fully autonomously (without human intervention). There are no regulations for such works.
If an artificially generated work was created with some degree of human intervention, national rules already exist for analysing who has influenced the machine in a creative manner such that they would be treated as author (or co-author). This could be, for instance, the developer of the algorithm or the entity that introduced the algorithm into the AI system. In essence, it is a case-by-case decision, depending on the circumstances, that will be determined by the national courts.
Finally, it is worth mentioning that the Austrian Copyright Act does not provide related rights (similar to the sui generis protection of databases) that would protect investment in original algorithms and databases for deep learning.
2.3. Trade secrets/confidentiality
According to Section 26b of the Austrian Act against Unfair Competition, a trade secret is information that is:
- secret because it is neither generally known nor readily accessible, either in its entirety or in the precise arrangement and composition of its components, to persons in the circles normally concerned with this type of information;
- of commercial value because it is secret; and
- subject to reasonable confidentiality measures, appropriate to the circumstances, by the person exercising legitimate control over that information.
This broad range of potentially protectable commercial information suggests that data more generally may also be covered. Simple raw data will regularly fail to meet the requirements of Section 26b (1) of the Unfair Competition Act, but this does not necessarily apply to refined data in a particular compilation, which as such is not generally known or readily accessible and which has commercial value. These criteria could apply to the recognised patterns and weights created during the training of a model for an AI system. A prerequisite for protection would be that the individual variables of the calculation within the model remain secret, which, however, could collide with transparency requirements under the GDPR and the draft EU AI Act where personal data are included.
Austrian courts have yet to decide a case involving such circumstances. It is therefore unclear whether the components of an AI system can be protected as trade secrets.
3. Data
In Austria, there is no comprehensive regulation of data in the context of AI systems, neither for the handling of non-personal raw data generated by machines nor for machine-generated personal data. Likewise, there are no regulations on the economic use and tradability of data. The issue is therefore largely governed by contractual agreements.
3.1. Domestic data law treatment
According to Section 285 of the Austrian Civil Code, everything that is distinct from the person and serves the use of people is called a thing in the legal sense. The prerequisite for qualification as a thing is that it is controllable and exists in a limited, available quantity; otherwise it would be common property. Under this definition, data is consistently understood as a thing in the legal sense, but there is no agreement as to whether it is generally to be treated as tangible or intangible. The predominant Austrian academic commentary assumes that data not stored on a data carrier is to be treated as an intangible thing.
However, no Austrian authority or court has so far dealt decisively with the ownership of data. Although data is considered marketable, much like other intangible goods, and can therefore be the subject of a purchase, it remains largely unclear which provisions on title acquisition and ownership actually apply to such a purchase.
3.2. General data protection regulation
Where AI systems process personal data (whether as input or as output), the European GDPR and the national provisions of the Austrian Data Protection Act apply. AI is specifically mentioned in connection with the obligation to conduct a data protection impact assessment.
Article 35 (4) of the GDPR provides that the national supervisory authority shall draw up and publish a list of processing operations for which a data protection impact assessment must be carried out (the so-called “blacklist”). The Austrian data protection authority (the Datenschutzbehörde or DSB) has made use of this provision.
According to Section 2 paragraph 1 of this blacklist, three processing scenarios could be applicable to AI systems:
- Data processing which involves the evaluation or classification of natural persons, including profiling and forecasting, for purposes relating to performance at work, economic situation, health, personal preferences and interests, reliability or conduct, or location or change of location of the person, and which is based solely on automated processing and is likely to produce adverse legal, physical or financial effects.
- Processing of data intended to evaluate the behaviour and other personal aspects of natural persons, which may be used by third parties for automated decision-making that produces legal effects concerning the persons evaluated or similarly significantly affects them.
- Processing of data using or applying new or novel technologies or organisational solutions which make it more difficult to assess the impact on data subjects and the societal consequences, in particular through the use of artificial intelligence and the processing of biometric data, provided that the processing does not concern the mere real-time rendering of facial images.
The GDPR provisions on profiling (Article 4 No. 4 GDPR) and automated individual decision-making (Article 22 GDPR) are of particular relevance for AI, as many AI systems are designed to assess human characteristics and because machine learning by definition works with automated decisions.
In the context of Article 15 GDPR (right of access), the Austrian Data Protection Authority has decided (decision of 8.9.2020, DSB 2020-0.436.002) that the scope of the right of access also includes the right to be informed about all relevant elements of an automated decision. This is intended to ensure the comprehensibility and correctness of the input variables concerning the data subject, as well as their currency. The information, however, does not include the logic of the algorithm, its source code, the compilation code or the complete documentation. Hence, the data protection authority recognises that certain calculation methodologies will almost certainly be subject to know-how protection.
3.3. Open data & data sharing
Directive (EU) 2019/1024 on open data and the re-use of public sector information (recast) is applicable from 17 July 2021. It replaces Directive 2003/98/EC on the re-use of public sector information, as amended by Directive 2013/37/EU. The current draft of an Austrian act on the re-use of public sector data (Informationsweiterverwendungsgesetz (IWG)), as of April 2022, is intended to implement Directive (EU) 2019/1024.
The draft IWG requires a written application for the re-use of documents of public bodies to be submitted to the public body in whose possession the requested document is. The application must clearly state the content, scope and manner of re-use of the requested documents. With regard to personal data in such documents, reference is made only to the applicability of the GDPR. However, the draft does not specify the criteria for balancing the interest in re-use against the data subjects' interest in confidentiality.
A critical point in the Austrian draft of the open data concept therefore remains the protection of the secrecy interests of natural persons. Insofar as personal data may lawfully be made publicly accessible, a gateway is opened for the violation of the right to informational self-determination, as the data subject loses control over the further use of his or her data. The Austrian legislator should therefore prescribe sufficient technical and organisational protective measures.
3.4. Biometric data: voice data and facial recognition data
According to Section 2 paragraph 1 of the "blacklist" (see above, Section 3.2 General data protection regulation), the processing of biometric data is also subject to a data protection impact assessment.
4. Bias and discrimination
4.1. Domestic anti-discrimination and equality legislation treatment
The principles of equal treatment in Austria are primarily laid down in the following laws:
- Federal Equal Treatment Act (Gleichbehandlungsgesetz (GlBG)) for the private sector and in other areas.
- Federal Act on Equal Treatment in the Federal Sector (Bundes-Gleichbehandlungsgesetz (B-GlBG)) for employment relationships in the federal service.
The GlBG protects against discrimination in employment on the grounds of gender (in particular with reference to marital status or whether someone has children), ethnicity, religion or belief, age or sexual orientation. Discrimination on these grounds is prohibited in the establishment of the employment relationship, in the determination of remuneration, in the granting of voluntary social benefits, in training and retraining measures, in career advancement (in particular promotions), in other conditions of employment and in the termination of employment.
The problem in the context of AI systems is that discrimination often occurs indirectly: the unequal treatment is not openly based on one of the above-mentioned grounds, yet a seemingly neutral criterion produces disadvantageous effects. For example, an AI model designed to weed out all part-time employees may produce an output containing almost only men, because in practice women hold the majority of part-time positions. This is indirect discrimination, as the sketch below illustrates. The Austrian courts have not yet dealt with this concept in the context of AI.
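To make the mechanism concrete, the following minimal Python sketch shows how a selection rule that never refers to gender can still produce a gender-skewed output through a proxy variable. The Employee records and the selection rule are invented toy assumptions for illustration only, not taken from any actual case or system:

```python
# Hypothetical illustration of indirect discrimination via a proxy variable.
# The data set and the selection rule are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    gender: str       # "f" or "m"
    part_time: bool

# Toy workforce in which, as is statistically common, most
# part-time positions are held by women.
employees = [
    Employee("A", "f", True),  Employee("B", "f", True),
    Employee("C", "f", True),  Employee("D", "f", False),
    Employee("E", "m", False), Employee("F", "m", False),
    Employee("G", "m", False), Employee("H", "m", True),
]

# The model's rule is facially neutral: it never mentions gender ...
selected = [e for e in employees if not e.part_time]

# ... yet the output is predominantly male.
male_share = sum(e.gender == "m" for e in selected) / len(selected)
print(f"selected: {len(selected)} of {len(employees)}, male share: {male_share:.0%}")
# -> selected: 4 of 8, male share: 75%
```

The gender-neutral filter excludes three of the four women but only one of the four men, which is precisely the disadvantageous effect the indirect discrimination concept addresses.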
5. Trade, antitrust and competition
With thanks to Gerhard Fussenegger, bpv Hügel Rechtsanwälte GmbH, for authoring Section 5.
5.1. AI-related anti-competitive behaviour
So far, in Austrian antitrust practice, no decision by the national competition authorities or courts has addressed AI-related market abuses.
In its thesis paper "Digitalisation and Competition Law" ("Digitalisierung und Wettbewerbsrecht"), the Austrian Federal Competition Authority (FCA) suggests reversing the burden of proof in cases of abuse that are typical of the digital economy. In the FCA's view, this appears justified where there is a prima facie case of abusive conduct or where official investigative measures quickly come up against natural or technical limits. It would then be up to the (dominant) companies to explain, with recourse to the data available to them, why a certain practice or conduct does not have anticompetitive abusive effects. Following that suggestion, with regard to AI, the undertakings concerned would, for example, have to prove that their algorithms do not abuse their dominant position in the relevant market.
In its case report on its investigation of Amazon, the FCA, confronted with an alleged abuse of dominance by a global digital player, concluded that Amazon has market power even though the relevant market was not conclusively defined. The FCA referred the question to the European Commission, whose recommendation was that, when analysing multilateral markets, it is advisable to focus less on a precise market definition and more on a plausible theory of harm.
5.2. Domestic regulation
Again, no domestic competition rules or regulations on AI exist in Austria. However, the 2021 reform of the Austrian Cartel Act introduced new criteria for assessing market dominance, all of which address the digital economy. For example, the amended Cartel Act now refers to, inter alia, access to relevant data and the benefits of network effects as (exemplary) criteria for market dominance.
6. Domestic legislative developments
Austrian federal ministers presented the federal government's strategy for artificial intelligence (AIM AT 2030), with its goals and fields of action, in September 2021. The question to be addressed by this strategy is whether the current legal framework for product liability, product safety, data protection and consumer protection is sufficient for products with embedded AI, or whether new regulations are needed, especially with regard to learning AI systems.
The Austrian strategy follows the European draft AI Act and intends to use AI regulatory sandboxes to analyse how new innovative technologies function within the existing regulatory environment, thereby also offering regulators the opportunity to gain important insights into whether and where there is a need for regulation.
7. Frequently asked questions
1. How is liability regulated between the different parties in AI systems (for example, software developer, data analysts and user)?
There are no specific rules in Austria. Within a contractual relationship, the general rules of fault-based liability apply to the developer; in addition, where relevant, strict liability under the Product Liability Act applies to damage caused by products in which AI systems are implemented.
2. Can we license data? Are there any specific rules?
The licensing of data has not yet been addressed in Austria. However, within the framework of the private autonomy prevailing in Austria, each party is free to deal with its property (and therefore also with data; see Section 3. Data), to make it available to third parties and to demand payment for it.
3. Can personal data be used for the training of AI?
See Section 3. Data