
Aug 18, 2025

Are Stablecoins Suitable as a Means of Payment in Limited Networks?

Stablecoins have been specifically regulated crypto assets since summer 2024 under the Markets in Crypto Assets Regulation (MiCAR). They can be issued either as E-Money Tokens (EMT) or as Asset Referenced Tokens (ART). In particular, MiCAR regulates the issuance of EMTs and ARTs strictly and in granular detail. The issuance of EMTs is reserved exclusively for electronic money institutions or credit institutions authorized in the European Union. The issuance of ARTs may only be carried out by companies explicitly authorized as ART issuers in accordance with Art. 16 ff. MiCAR or by credit institutions. However, it is not only the issuance of EMTs or ARTs that may be subject to licensing requirements. If EMTs or ARTs are accepted as means of payment, transaction support services may constitute activities that may not be provided without prior authorization from the competent supervisory authority, such as BaFin in Germany. MiCAR is not the only regime that plays a role here, although it regulates crypto asset services as activities subject to authorization. The provisions of the German Payment Services Supervision Act (ZAG), which is based on the requirements of the second Payment Services Directive (PSD2), may also have to be taken into account in individual cases.

E-Money Tokens are Both Crypto Assets and E-Money

Article 48(2) MiCAR provides for the special feature that E-Money Tokens are considered e-money. At the same time, however, they are also defined in Article 3(1)(7) MiCAR as crypto assets whose value stability is to be maintained by reference to the value of an official currency. EMTs thus have a hybrid status for regulatory purposes. While they are subject to the provisions of MiCAR as a special form of crypto asset, as electronic money within the meaning of Article 2(2) of the Second Electronic Money Directive (EMD2) and Section 1(2) sentence 3 ZAG they also constitute funds within the meaning of Article 4 No. 25 PSD2 and can therefore be the subject of payment services requiring authorization. In this regard, the European Banking Authority (EBA) published a “no-action letter” on June 10, 2025, in which it advised national supervisory authorities in the European Union not to require affected companies to comply with the provisions of PSD2 in relation to the provision of payment services with EMTs until March 2, 2026. Currently, therefore, service providers offering customers custody or transaction-supporting services in connection with EMTs must have authorization as a crypto asset service provider (CASP) pursuant to Art. 59 et seq. MiCAR. MiCAR does not provide for a sectoral exemption for limited networks, for exclusive use on enclosed business premises, or for limited ranges of services or goods. In this respect, there is no exemption from the general CASP authorization requirement in such constellations. For the period after March 2, 2026, however, business models falling under the exemptions for limited dealer networks may be able to avoid an additional licensing requirement under Section 10(1) ZAG by making use of these exemptions.

ARTs are Mere Crypto Assets and are Not Subject to the ZAG

The situation is different for asset-referenced tokens: although they qualify as crypto assets under Article 3(1)(6) MiCAR, MiCAR does not contain any provision classifying ARTs as funds within the meaning of PSD2. ARTs are therefore not subject to the regulatory regime of the ZAG. Recital 62 MiCAR does mention that ARTs may pose a threat to the smooth functioning of payment systems, monetary policy transmission or monetary sovereignty, which at least places them close to means of payment comparable to funds. In its “no-action letter” dated June 10, 2025, however, the EBA also clarified that, in its opinion, ARTs should not be classified as funds within the meaning of PSD2. In light of this, based on the current legal situation, it can be assumed that only the provisions of MiCAR apply to the custody of and transaction support in connection with ARTs. Accordingly, service providers cannot take advantage of the exemptions under payment services law for closed networks or limited ranges of goods or services. They must either obtain authorization under MiCAR as a CASP or seek to apply the exemptions set out in Art. 2 MiCAR to their business model, provided that this is possible in individual cases.

Attorney Dr. Lutz Auffenberg, LL.M. (London)

I. https://fin-law.de

E. info@fin-law.de


Aug 11, 2025

AI Compliance in Companies (Part III) – Scope of the GDPR and AI Act?

With the rapid development of artificial intelligence, companies in the European Union are facing a complex regulatory landscape that is largely shaped by two pillars: the General Data Protection Regulation (GDPR) and the new Artificial Intelligence Regulation (AI Act). While the GDPR has been regulating the handling of personal data for years and has established itself as the standard for data protection, the AI Act is now the first comprehensive regulation specifically for AI systems. At first glance, both sets of regulations appear to pursue similar goals, such as protecting fundamental rights and building trust in new technologies. But how do these two comprehensive laws relate to each other? This question becomes particularly relevant when AI systems are trained or operate on the basis of personal data. Personal data is often the “fuel” of AI systems. This dual regulation raises crucial questions: Is compliance with one regulation sufficient, or are new, overlapping obligations emerging that could lead to costly pitfalls? If companies want to rely on the use of AI, they should first clarify the differences and similarities between the GDPR and the AI Act.

Scope of the GDPR and the AI Act

The GDPR focuses on the processing of personal data. Personal data is any information relating to an identified or identifiable natural person (Art. 4 No. 1 GDPR). Processing therefore includes virtually any handling of personal data, from reading and storing to transferring and deleting. The GDPR is designed to be technology-neutral, which means that its provisions apply regardless of the technology used, as long as personal data is processed. In contrast, the AI Act primarily regulates AI systems and AI models themselves. An AI system is defined as a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments (Art. 3 No. 1 AI Act). The AI Act does not directly define what an AI model is. However, Recital 97 of the Regulation states that AI models are central components of an AI system, which become an AI system through additional components such as a user interface. In simple terms, the AI model is the neural network and thus the core of the AI system.

Differences and Similarities

The main objective of the GDPR is to protect the fundamental rights of natural persons against risks that may arise from data processing. The GDPR requires data controllers to take both technical and organizational measures to address the risks to data subjects (Articles 25 and 32 GDPR). Personal data may only be processed in accordance with the principles laid down in the GDPR. The controller is accountable to the data subjects in this regard (Art. 5(2) GDPR). The lawfulness of processing must be assessed in each individual case. In the case of the use of new technologies, which undoubtedly includes AI, a well-documented data protection impact assessment must also be considered (Art. 35 GDPR). The AI Act aims to ensure that AI is trustworthy and secure and is developed and used in accordance with fundamental rights. The AI Act is primarily product safety law that establishes uniform rules for the placing on the market, putting into service, and use of AI systems and AI models within the EU. In its implementation, the AI Act focuses primarily on classifying AI systems and AI models into specific risk categories, which are subject to different legal frameworks. The AI Act defines risk as the combination of the probability of damage occurring and the severity of that damage (Art. 3 No. 2 AI Act). The AI Act addresses the risks posed by AI by laying down specific rules for AI technologies and their application. Although the focus of the GDPR and the AI Act is different, they are closely linked in areas where AI systems process personal data. Both laws aim to minimize risk. The AI Act complements the GDPR by addressing specific risks posed by AI technologies. Although compliance with the AI Act can also help to meet the requirements of the GDPR, AI Act compliance alone is generally not sufficient for this purpose.

FIN LAW


Jun 23, 2025

The Crypto Custody Agreement According to MiCAR – What Must Crypto Custodians Mandatorily Agree Upon With Their Customers?

The custody and management of crypto assets for others is a regulated crypto asset service under Art. 3 (1) No. 16 lit. a) MiCAR and Art. 3 (1) No. 17 MiCAR. It may therefore only be provided by companies that have been authorized as crypto asset service providers under Art. 59 MiCAR. In addition to the usual strict requirements that must be met by companies regulated under MiCAR in the European Union, such as sufficient initial regulatory capital, fit and proper managers, and proper business organization with regard to risk management, IT security, and money laundering prevention, among other things, crypto asset custodians must also fulfill specific supervisory compliance obligations. One of these special requirements for crypto custodians is the obligation to conclude a custody agreement with custody clients that includes the minimum content required under Article 75(1) MiCAR. Accordingly, MiCAR-compliant custody agreements must contain at least information on the identity of the contracting parties, a description of the type of crypto service offered, information on the custody strategy, the means of communication used and how customers authenticate themselves to the crypto custodian, the security systems used, the fees and costs, and the applicable law.

What Exactly Must a Crypto Custody Agreement Contain in Regard to the Custody Strategy?

MiCAR does not specify exactly what crypto custodians must agree with their custody clients with regard to the custody strategy. The development and implementation of a custody strategy is primarily a regulatory obligation that crypto custodians must demonstrate to the supervisory authorities that oversee them. Article 75(1) MiCAR, which regulates the minimum requirements for custody agreements, merely stipulates that the custody strategy is part of the minimum content of a crypto custody agreement. However, this provision is specified in more detail in Article 75(3) MiCAR, which provides for a right of custody account holders to receive a summary of the custody strategy in electronic form from their crypto custodians. In order to be able to meet this requirement, crypto custodians will have to maintain an electronic document summarizing the custody strategy. The actual agreement of the custody strategy with the customer, or the attachment of the complete custody strategy, for example as an annex to the custody agreement, seems unnecessary, especially since any change to the strategy would then require renegotiation or a new crypto custody agreement. This cannot have been in the interest of the MiCAR regulator. It should also be noted that, as a strategy document, the custody strategy should not contain any specific technical implementation measures or the names of employees or any third-party service providers that may be involved. A strategy generally formulates goals, objectives, and ways to achieve them.

What Details Regarding Security Systems Must Be Agreed Upon?

Art. 75 (1) (e) MiCAR requires that crypto custody agreements include a description of the security systems used by the custodian. In this respect, it is rather unlikely that there will be any room for negotiation, as crypto custodians will hardly be able to grant custody clients any leeway in this regard. The description must include details on the technologies used for the custody of private keys, information on any vulnerability tests and security audits carried out, the authentication mechanisms provided for clients, and other security measures used by the custodian to minimize the risk of loss of clients’ crypto assets or the associated private keys. Information may also be provided on how client crypto assets are separated from the crypto asset custodian’s own holdings in crypto assets or funds and are kept safe from insolvency. Here, too, it will not be necessary to name specific sub-custodians or banks that are used to segregate client assets. A description of the specific measures implemented by the crypto custodian to increase security for customers will in any case be sufficient for the purposes of the crypto custody agreement.

Attorney Dr. Lutz Auffenberg, LL.M. (London)


Jun 10, 2025

Token Sale to Private Purchasers – Can the Issuer Freely Choose the Applicable Law for the Token Terms?

The execution of token sales has been a strictly regulated undertaking in the European Union since the Markets in Crypto Assets Regulation (MiCAR) came into force. Issuers must prepare and publish a detailed crypto asset white paper before the token sale begins. They must also comply with regulatory requirements regarding how they market the crypto assets, how they deal with conflicts of interest, and the security of the systems and protocols they use. If the crypto assets offered by the issuer are asset-referenced tokens (ART) or e-money tokens (EMT), additional regulatory obligations apply. In contrast, issuers are relatively free to design the token terms underlying the crypto assets to be offered. Token terms can be used to link the ownership of crypto assets to a wide variety of rights. As utility tokens, they can grant their holders access rights to the issuer’s goods or services, embody access to information, voting or election rights, tokenize redemption rights in the underlying fiat currency in the case of ART or EMT, or be linked to other rights. MiCAR does set out basic requirements for the underlying contracts for certain types of crypto assets. However, the specific rights and obligations of token holders and issuers can be agreed upon in accordance with the applicable national private law. But are there limits that issuers must observe when determining the private law applicable to token terms?

In Principle, there is a Free Choice of Law under Article 3 of the Rome-I-Regulation

Firstly, the principle of free choice of law arising from Article 3 of the Rome-I-Regulation also applies to the creation of token terms. According to this principle, issuers of crypto assets are free to decide which law should apply to the contract with the purchasers of crypto assets when drafting their token terms. However, in the case of the offering of crypto assets to private purchasers, this principle applies only to a limited extent. This is because, according to Article 6(2) of the Rome-I-Regulation, a choice of law under Article 3 of the Rome-I-Regulation may not result in a consumer having weaker rights than if the law of his home country or the country in which he is habitually resident were applicable. The provision is intended to ensure that consumers in the European Union can always rely on the consumer protection rights they enjoy in their everyday lives in their country of habitual residence. In the area of token sales aimed at private purchasers, this would mean that private purchasers could potentially assert different consumer rights against the issuer in the context of a token sale, depending on their country of habitual residence. This result would be impractical and would run contrary to the principle of equal treatment of all purchasers in a particular issue. In this context, the question arises as to whether the Rome I Regulation provides for an applicable exception for token sales to private purchasers.

In the Case of Public Offers of Financial Instruments, the Consumer Protection Privilege of Article 6(2) of the Rome-I-Regulation Does Not Apply

The question arises as to whether the exception in Article 6(4)(d) of the Rome-I-Regulation can also apply to crypto assets and contractual terms governing the issuance and public offering of crypto assets. According to its wording, the provision refers exclusively to financial instruments, which, under EU law, are generally only instruments within the meaning of Article 4(15) of the EU Directive on markets in financial instruments (MiFID2). The exception therefore applies in particular to transferable securities, units of investment funds, derivatives, or similar products. According to the alternative relationship between MiCAR and MiFID2 expressly provided for in Article 2(4)(a) MiCAR, crypto assets are not financial instruments. However, there are good arguments in favor of applying the exception in Art. 6(4) of the Rome-I-Regulation by analogy, as the interests of issuers and purchasers are comparable in the case of token sales of uniformly structured crypto assets. The European regulator appears to have simply overlooked the problem of consumer protection privileges for token sale participants with consumer status when MiCAR was introduced, so this is likely to be an unintended regulatory gap. Moreover, Article 6(4) provides for an exception for contracts for financial services. This makes sense because financial services only relate to a financial instrument and do not specify the specific rights and obligations arising from it. Since the comparison with the area of crypto assets can also be affirmed in this respect, the exception should also apply mutatis mutandis to downstream services – in this case, crypto asset services within the meaning of Article 3(1)(16) MiCAR.

Attorney Dr. Lutz Auffenberg, LL.M. (London)


Jun 02, 2025

AI Compliance in Companies (Part II) – When Does an AI Model Fall Within the Scope of the GDPR?

The General Data Protection Regulation (GDPR) was deliberately drafted in a technology-neutral manner, so it is not surprising that the long arm of the GDPR also extends deep into processes involving AI. This is somewhat inevitable, as the development of Large Language Models (LLMs) requires the processing of ever-larger data sets. The development of an AI system can undoubtedly involve a number of activities relevant to data protection law on the part of the controller, from the development phase to the deployment phase. At the heart of every AI system is the underlying AI model, the neural network developed using machine learning. This requires training data to be collected and processed, and the AI model must then be trained. The collection and preparation of data may constitute processing within the meaning of the GDPR if the training data is personal data. Anonymizing personal data prior to training also constitutes processing, which is why the GDPR must be observed. In the deployment phase, i.e., when the AI system is used, the processing of personal data is often also envisaged, which must likewise be reviewed from a data protection perspective. However, in addition to these more obvious forms of personal data processing, the question arises as to whether an AI model that has been trained with personal data itself contains personal data. In other words, the question is whether the AI model itself can be subject to data subject rights under Art. 12 et seq. GDPR. In addition, supervisory authorities could order measures to remedy the unlawfulness of the processing of personal data in the development phase of an AI model. These include fines, temporary restrictions, the deletion of unlawfully processed data sets (in whole or in part), or even the deletion of the AI model itself.

Is an AI Model Anonymous or Does It Contain Personal Data?

Whether an AI model itself is anonymous depends on whether the AI model contains personal data. According to Art. 4 No. 1 GDPR, personal data is any information relating to an identified or identifiable natural person. In contrast, the GDPR does not apply to anonymous data, i.e., data that does not relate to an identified or identifiable natural person, or personal data that has been anonymized in such a way that the data subject cannot be identified or is no longer identifiable. If an AI model has been trained (also) with personal data, the question arises as to what extent the AI model contains personal data as a result of this training. In this context, the Hamburg Commissioner for Data Protection and Freedom of Information stated in its discussion paper “Large Language Models and Personal Data” on the applicability of the GDPR to large language models that the mere storage of an LLM does not constitute processing within the meaning of Art. 4 No. 2 GDPR, as no personal data is stored in LLMs themselves. This is justified by the fact that LLMs work on the basis of tokens (linguistic fragments) and embeddings (mathematical representations of the relationships between tokens) and represent “highly abstracted and aggregated data points from the training data and their relationships to each other without concrete characteristics or references to natural persons.” In a recent statement “on certain data protection aspects of the processing of personal data in the context of AI models,” the EDPB has now rejected the Hamburg Data Protection Commissioner’s thesis. The EDPB clarifies that an AI model trained with personal data cannot be considered anonymous in all cases. The claimed anonymity must therefore be examined by the competent supervisory authorities on a case-by-case basis.

How Will the Distinction Be Made?

An AI model can only be considered anonymous if two cumulative conditions are met: the probability of direct (including probabilistic) extraction of personal data about the individuals whose data was used for training, and the probability that such personal data will be obtained through intentional or unintentional queries, must both be negligible for each individual concerned. This is convincing, as information may also relate to a natural person if it is encoded in such a way that the relationship is not immediately apparent. Although AI models do not usually contain direct records of personal data, but only parameters that represent probabilistic relationships between the data contained in the AI model, it is possible to derive information from the AI model. Under certain circumstances, statistically derived personal data can be extracted from the AI model. The probability assessment to be carried out should take into account all means likely to be used by the controller or another person acting in the exercise of their normal activities, including the unintended (re)use or disclosure of the AI model. According to the EDPB, the criteria for assessing the residual probability of identification should include the characteristics of the training data set (e.g., uniqueness of the data sets, accuracy), the methods used for training, and the implementation of technical and organizational measures to reduce identifiability (e.g., regularization methods, differential privacy). The results of structural tests that check resistance to attacks such as attribute and membership inference, exfiltration, or regurgitation of training data, the context in which the AI model is released and/or processed (e.g., public availability versus internal use), and additional information that could be available to another person for identification must also be taken into account.

Controllers must document the measures taken to reduce the probability of identification and the possible remaining risks, not least because this documentation in particular is to be taken into account by the competent authorities in order to assess the anonymity of an AI model. If, after reviewing the documentation and the measures implemented, the competent authority cannot confirm anonymity, it can be assumed that the controller has not fulfilled its accountability obligations under Article 5(2) GDPR. Careful documentation is therefore strongly recommended.

FIN LAW


May 26, 2025

Descoping MiCAR – Are NFTs the Last Stronghold of the Unregulated Crypto Market?

For a long time, the crypto market fascinated many of its participants with its lack of regulation and oversight. While the unregulated environment did provide fertile ground for dubious and criminal activities, it also undoubtedly enabled numerous technical innovations in just a few years that would have been virtually impossible in a regulated environment. The regulatory framework is too restrictive for the implementation of new ideas and the use of technologies that are not yet established, meaning that market participants must carefully consider whether they are willing to accept the risks of entirely unpredictable reactions from regulatory authorities to innovative approaches or whether they would be better off relying on best practices and traditional business models. If the crypto market had been comprehensively regulated from the outset, today’s technical possibilities would hardly have been able to evolve to this point. In any case, with the introduction of MiCAR, the European crypto market has left the “Wild West” behind and is now comprehensively regulated. Of course, there are good reasons for regulation, especially since crypto assets are not infrequently used for money laundering and terrorist financing, and such practices must be stopped. However, innovation is extremely difficult in the new regulated market, as the rules and their interpretation by ESMA, EBA, and the competent authorities, such as BaFin in Germany, leave little room for interpretation. However, so-called non-fungible tokens (NFTs) are excluded from the scope of MiCAR pursuant to Art. 2 (3) MiCAR. Crypto assets that qualify as NFTs can therefore not be the subject of crypto asset services. Nor can their issuance on the market constitute a public offer that would trigger an obligation to prepare a crypto asset white paper. There is therefore still considerable scope for innovation in this area.

What Precisely is an NFT in the Context of MiCAR?

In order to fall under the exemption provided for in Article 2(3) MiCAR, a crypto asset must, according to the wording of the provision, first be unique and not fungible with other crypto assets. The text of the regulation itself does not impose any further requirements. However, recitals 10 and 11 preceding the text of the regulation provide further guidance as to which specific crypto assets the regulator intended to exclude from the scope of MiCAR as NFTs. This makes it clear that the key factor is the uniqueness of a crypto asset. If a crypto asset is not readily exchangeable for another crypto asset and its relative value cannot be determined by comparison with an existing market or an equivalent asset due to its uniqueness, it should be considered an NFT within the meaning of the exemption in Article 2(3) MiCAR. In recital 11, however, the regulator emphasizes that crypto assets that, despite having unique characteristics, are ultimately part of a large series or collection should not be considered unique within the meaning of MiCAR. Simply numbered crypto assets that differ only in their serial number are therefore certainly not to be classified as NFTs eligible for exemption, to give an obvious example. Similarly, according to the intention of MiCAR, a crypto asset should not constitute an NFT if its de facto characteristics or de facto intended use make it an interchangeable and non-unique crypto asset, even if it appears to be unique at first glance. In this respect, the economic approach should be decisive.

What are Examples of NFTs That Do Not Fall Under MiCAR and What Else Needs to Be Considered?

In recitals 10 and 11, digital art and collectibles are explicitly mentioned as NFTs. Similarly, the regulator also mentions non-fungible services such as product warranties and non-exchangeable assets such as real estate. However, the specific examples cited should not obscure the fact that the legal classification of a crypto asset must be based on the requirements set out above and that, therefore, a tokenized service such as car washing would not qualify as an NFT within the meaning of MiCAR merely because a different car is washed depending on the token used. This alone would not be sufficient to make the tokens individually identifiable. Further examples of NFTs include crypto assets in areas of application such as voting at general meetings of stock corporations, product supply chain or identity management, or access and authorization management. NFTs are likely to be particularly interesting in the future in the fight against AI-generated counterfeits, as they can be used to provide unique proof of authenticity. Even if crypto assets qualify as NFTs within the meaning of Art. 2 (3) MiCAR in individual cases, they may still be regulated on other legal grounds, for example as financial instruments under MiFID2 or as cryptographic instruments within the meaning of the German Banking Act (KWG). Start-ups with promising ideas in the NFT sector should definitely take advantage of the current legal situation while it lasts.

Attorney Dr. Lutz Auffenberg, LL.M. (London)


May 19, 2025

Descoping MiCAR – When Does the Scope Exemption for Financial Instruments Apply?

Obligations under the EU Regulation on Markets in Crypto Assets (MiCAR) generally only arise for market participants if they carry out or plan to carry out activities involving or related to crypto assets. The regulator has defined crypto assets very broadly in MiCAR. According to this definition, crypto assets are essentially any digital representation of value or right that can be transferred and stored electronically using distributed ledger technology or similar technology. However, the provisions of MiCAR are not always applicable once the criteria of the definition of crypto assets are met. This is because Art. 2 MiCAR provides for a number of exceptions in this respect. One of the most relevant exceptions is provided for in Art. 2(4)(a) MiCAR for crypto assets that qualify as financial instruments within the meaning of MiFID2, i.e., the EU Directive on Markets in Financial Instruments. If the exception applies because a crypto asset is classified as a financial instrument within the meaning of MiFID2, MiCAR does not apply in any part, according to the clear wording of Article 2(4) MiCAR. But under what exact circumstances is a crypto asset to be regarded as a financial instrument?

              ESMA Publishes Guidelines on the Distinction Between Crypto Assets and Financial Instruments

              On December 17, 2024, ESMA published the final version of its guidelines on the conditions and criteria for classifying crypto assets as financial instruments, the draft of which it had previously submitted to market participants for consultation. ESMA was required to publish these guidelines pursuant to Article 2(5) MiCAR. Essentially, ESMA instructs Member States, competent authorities, and market participants to apply the “substance over form” principle when assessing whether a crypto asset qualifies as a financial instrument. Accordingly, the legal classification of a crypto asset is not determined by the form in which it appears on the market, its name or the promises associated with it, but rather by its inherent characteristics. The legal assessment should also be based on a technology-neutral approach to ensure that crypto assets of any technical design can fall under the MiCAR definition of crypto assets, regardless of any future technical developments. ESMA considers it problematic that the term “financial instrument” is not defined uniformly for the entire EU, but that there is only a list of examples of financial instruments in the form of an EU directive implemented by the member states, leaving member states the option of creating additional categories of financial instruments at the national level. Examples under German law include, in particular, units of account and investments under the German Asset Investment Act (Vermögensanlagengesetz). It is precisely such national legal forms of financial instruments, but also hybrid crypto assets that combine several functions, that make it difficult in individual cases to answer the question of whether they – if tokenized – fall under MiCAR as crypto assets or under MiFID regulation as financial instruments.

              When Does ESMA Consider a Crypto Asset to Be a Financial Instrument?

              Annex I Section C to MiFID2 contains a list of instruments that qualify as financial instruments within the meaning of the Directive. According to this, transferable securities in particular, but also money market instruments, units in investment funds, options, derivatives and CFDs, as well as emissions certificates, are financial instruments within the meaning of MiFID2. Essentially, ESMA expects the competent supervisory authorities and other legal practitioners, in line with the “substance over form” approach, to check, when assessing whether a crypto asset qualifies as a transferable security, whether the instrument in question is tradable on the capital markets, does not constitute a payment instrument, is fungible, and embodies rights similar to those of securities, as is usually the case with shares, debt instruments, and other securities. Tradability exists if the crypto asset can be freely traded and transferred. According to the ESMA guidelines, crypto assets that are used as a means of exchange are considered payment instruments. Voting rights associated with crypto assets should only be considered similar to securities if they enable genuine participation in corporate decisions (e.g., election of the board of directors, approval of transactions, etc.), but not if they merely entitle the holder to participate in votes on technical aspects, protocol upgrades, or fee adjustments. For hybrid crypto assets that fulfill the characteristics of a financial instrument but also have other functions, such as an access authorization function, the financial instrument characteristic of the instrument in question should take precedence in accordance with the ESMA guidelines.
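The cumulative criteria ESMA describes for the transferable-security assessment can be illustrated as a simple decision sketch. This is a deliberate simplification for illustration only; the actual assessment is a case-by-case legal analysis, and the function and attribute names below are invented:

```python
def is_transferable_security(asset: dict) -> bool:
    """Rough sketch of the cumulative ESMA criteria for classifying a
    crypto asset as a transferable security (illustrative only)."""
    return (
        asset["tradable_on_capital_markets"]        # freely tradable and transferable
        and not asset["is_payment_instrument"]      # means of exchange -> excluded
        and asset["fungible"]                       # interchangeable units
        and asset["embodies_security_like_rights"]  # e.g. genuine corporate voting rights
    )

# A governance token whose votes concern only protocol upgrades and fee
# adjustments would, per the guidelines, lack security-like rights:
governance_token = {
    "tradable_on_capital_markets": True,
    "is_payment_instrument": False,
    "fungible": True,
    "embodies_security_like_rights": False,
}

print(is_transferable_security(governance_token))  # False
```

The sketch only mirrors how the criteria are cumulative: failing any one of them takes the asset out of the transferable-security category.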

              Attorney Dr. Lutz Auffenberg, LL.M. (London)

              I.  https://fin-law.de

              E. info@fin-law.de


                May 12, 2025

                AI Compliance in Companies (Part I) – Why and How Should Employees Be Trained?

                The use of AI in companies has the potential to save costs and increase sales through automation. At least, that is what many companies are currently hoping for from the new hype surrounding deep learning and generative AI that has emerged in recent years, triggered by the success of ChatGPT. Developments in this area have been progressing rapidly ever since, with large US and Chinese tech companies in particular seeming to be engaged in a fierce competition to outdo each other on an almost weekly basis. In doing so, they are improving their models with new features, greater accuracy, and increased efficiency, for example. In addition to the major players, a wide range of other service providers and specialized tools for various areas of application have already established themselves. This rapid development makes it difficult to keep track of which solution might be suitable for your own company. Companies must first ask themselves what AI can actually do and where its technical limits lie. Then they need to clarify whether and how AI can be used profitably. Key decisions concern whether the project should be implemented in-house or outsourced. Should AI be developed or adapted in-house (fine-tuning), or should third-party applications be used, such as “AI as a Service” (AIaaS)? Should data be processed on your own servers, or is a cloud solution preferred? A strategic approach and sufficient AI expertise are essential to answer all these questions. Management is responsible for setting the course for the success of AI projects. This includes ensuring that employees have the necessary expertise in dealing with AI systems. The employees involved must therefore be trained accordingly. However, this is no longer just a business necessity, but also a direct result of legal requirements. The EU AI Act refers to AI literacy in this context. But what exactly does AI literacy mean, and how far do the legal training requirements extend?

                Requirement for AI Literacy

                The AI Act defines AI literacy as the skills, knowledge, and understanding that enable providers, operators, and affected parties, taking into account their respective rights and obligations under this Regulation, to use AI systems competently and to be aware of the opportunities, risks, and potential harm that AI can cause. According to Article 4 of the AI Act, providers and operators of AI systems are required to take measures to ensure, to the best of their ability, that their personnel and other persons involved in the operation and use of AI systems on their behalf have a sufficient level of AI literacy. The technical knowledge, experience, education, and training of employees must be taken into account, as well as the specific context in which the AI systems are to be used and the target groups for which the systems are intended. Unlike most other obligations under the AI Act, the training obligation applies regardless of how the underlying AI systems are classified in the various risk categories of the regulation. Companies are therefore obliged to ensure that their employees have a sufficient level of AI literacy. The obligation to ensure AI literacy must be fulfilled to the best of the company’s ability, in an individualized and context-specific manner. In concrete terms, this means that companies must tailor training to the technical knowledge, experience, and educational level of their employees. At the same time, the specific context in which the AI systems are used must be taken into account. However, it remains unclear how these requirements are to be implemented in individual cases. There is currently no simple set of rules or checklist that companies can use in this context. Instead, companies must develop their own appropriate measures and training concepts to meet the requirements of the regulation.

                Who Requires Training and What Are the Benefits of an AI Officer?

                There is currently no established best practice for ensuring AI literacy. To counteract this and promote exchange between companies, the European AI Office has published the Living Repository of AI Literacy Practices, which presents the practices implemented by participating companies to promote AI literacy. This can provide valuable information for your own implementation. Based on this, the following steps are necessary: determining the target group and analyzing needs, considering the application context, selecting and implementing training approaches, measuring and evaluating the impact (KPIs), and dealing with challenges and continuous improvement. The first step should therefore be to lay the foundations, which includes formulating AI guidelines and, building on these, guidelines for employees on how to deal with AI. Governance structures should be established for this purpose. One possible measure could be the introduction of an AI function (AI officer) to manage skills development. When it comes to training, it makes sense to distinguish between basic training for all employees and target group-oriented training for specific areas of responsibility. The basic training can provide all employees with a fundamental understanding of how AI systems work and the ethical and legal challenges they pose. In more advanced, subject-specific training courses, the knowledge required for the respective area of responsibility can be imparted and the basic knowledge deepened. Depending on requirements, the topics to be covered could include the following in particular: General basics on AI, technical fundamentals, areas of application and limitations, security and risk management, legal framework and compliance requirements (especially AI Regulation and GDPR), as well as ethical and social aspects. The training process should be standardized to ensure a consistent level of competence throughout the company. 
The training should be documented and reviewed for effectiveness. Training can also be provided by external service providers. However, management should ensure that the training covers content relevant to the company’s own situation.

                FIN LAW

                I.  https://fin-law.de

                E. info@fin-law.de


                  Apr 30, 2025

                  The Future of Programming: Vibecoding with Artificial Intelligence

                  Generative language models (AI) have ushered in a remarkable transformation in software development in recent years. Tools such as GitHub Copilot, Cursor, Gemini Code Assist, and all-rounders such as ChatGPT, Claude, and Le Chat impressively demonstrate that AI has long been capable of generating complex code from simple text inputs. These models, trained on huge amounts of publicly available code, now offer support for reviewing, adapting, or even completely developing software. A new phenomenon is particularly noticeable here: vibecoding. This approach relies almost entirely on AI-assisted code generation, opening the doors to the world of programming even for people without in-depth technical knowledge. Pandora’s box has been opened, and where years of learning were once required, a well-formulated prompt is often all that is needed today – AI does the rest. But despite all the fascination with this new ease of software development, the risks must not be ignored. The danger of using AI-generated code without really understanding it is real. Bugs, security vulnerabilities, or outdated practices can creep into projects unnoticed, which can have serious consequences, especially in professional environments where maintainability, scalability, and security are essential. Legal issues also arise, for example in relation to copyright. In particular, the question arises as to who owns the code generated by AI and how the risks can be mitigated.

                  Copyright Pitfalls When Using AI in Software Development

                  The use of generative AI in programming raises complex copyright issues that developers and companies should not ignore. On the one hand, the training of many AI models is based on large amounts of data, which often also contain copyright-protected material. On the other hand, the question arises as to whether and to what extent the code generated by AI itself is protected by copyright – and who, if anyone, can be considered the author. Under German copyright law (Sections 69a et seq. UrhG), computer programs and individual program components, such as code snippets, are generally protectable if they are the result of an original intellectual creation. This protection applies to the specific form of expression of the code, while underlying ideas, algorithms, or interface concepts are explicitly excluded from copyright protection. A crucial problem with purely AI-generated code is that, since it lacks human creativity, such products do not enjoy copyright protection. This means that software created entirely by an AI coding assistant is generally considered “unprotected” in terms of copyright law (public domain). The situation may be different if humans play a significant role in the creative process, for example by providing precise specifications and exerting targeted influence. In such cases, the term “AI-assisted work” is used, and copyright protection may well apply. Particular difficulties arise when AI coding tools draw on public code libraries or packages. Several dangers lurk here: In addition to the risk of unnoticed malicious code being integrated, there is the possibility of adopting copyright-protected code that is subject to special open source licenses. Anyone who incorporates code fragments into their software without checking the respective license terms could quickly violate license requirements and be liable for damages. 
License models such as the GPL in particular stipulate far-reaching obligations that can significantly restrict commercial use.
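One practical mitigation for the open-source risk described above is to check every adopted dependency against an allowlist of licenses approved for commercial use. A minimal sketch, assuming a hypothetical company policy and invented dependency data; in practice the license metadata would come from package manifests or an SBOM tool:

```python
# Licenses the (hypothetical) company policy permits in proprietary products
ALLOWED_LICENSES = {"MIT", "Apache-2.0", "BSD-3-Clause"}

# Example dependency metadata, as it might be extracted from package manifests
dependencies = {
    "http-client": "MIT",
    "crypto-utils": "Apache-2.0",
    "report-engine": "GPL-3.0",  # copyleft: would impose far-reaching obligations
}

def license_violations(deps: dict) -> list:
    """Return the names of dependencies whose license is not on the allowlist."""
    return [name for name, lic in deps.items() if lic not in ALLOWED_LICENSES]

print(license_violations(dependencies))  # ['report-engine']
```

Such a check does not replace a legal review of the individual license terms, but it flags copyleft components before they reach a commercial build.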

                  How Companies Can Protect Themselves from the Risks of AI-Generated Software

                  Companies that use AI-supported coding tools or purchase software from external service providers should address the associated legal and technical risks early on and in detail. Conscious risk management is essential to avoid liability traps, security gaps, and license violations. If a company uses external service providers, it is important to include appropriate provisions in the contract. In particular, the labeling of AI-generated products and the transparent handling of third-party sources and licenses used should be contractually guaranteed. When acquiring licenses for third-party works, it must also be clearly defined whether and to what extent AI was used in the creation of the software. For companies that carry out software development internally, the new EU Regulation on Artificial Intelligence (AI Regulation) is coming into focus. While most of the provisions of the AI Regulation will not apply until August 2, 2026, Chapters I and II have already been in effect since February 2, 2025. Article 4 of the AI Regulation in particular obliges companies to provide adequate training for their personnel involved in the operation and use of AI systems. Employees must have a sufficient level of AI competence to be able to recognize and manage risks, for example when using coding assistants. Companies should therefore ensure that their developers are not only technically proficient, but also aware of legal pitfalls, security aspects, and licensing issues when dealing with AI-generated code. In addition, it is advisable to introduce a company-wide AI policy. This should contain clear guidelines on the use of AI tools, the testing of generated code, and the handling of open source licenses and third-party libraries. Such guidelines help to establish uniform standards and minimize the risk of wrong decisions at the operational level. 
Transparent handling of AI products and proactive engagement with new regulatory requirements are key to leveraging the benefits of AI in coding in a safe and legally compliant manner.

                  FIN LAW

                  I.  https://fin-law.de

                  E. info@fin-law.de


                    Apr 22, 2025

                    MiCAR Transition – What are BaFin’s Obligations towards CASPs under MiCAR Grandfathering?

                    The regulations on the authorization requirement and compliance with specific compliance obligations for crypto asset service providers (CASP) have been in force since December 30, 2024. Companies that were already providing services in an EU or EEA member state prior to this date in accordance with the law applicable at that time, which are now classified as crypto asset services within the meaning of MiCAR, may continue to offer these services to their customers for the time being, even if they have not yet obtained MiCAR authorization from the competent authority. In this respect, the new regulation provides for a transitional arrangement in Article 143(3) MiCAR, also known as grandfathering. Article 143(3) MiCAR provides that activities lawfully provided prior to December 30, 2024, may be continued without the required MiCAR authorization until July 1, 2026, or until the date on which the competent national supervisory authority has made a positive or negative decision on the company’s MiCAR authorization application. Under the MiCAR grandfathering rule, it is not necessary for the company to actually submit the license application at a specific point in time. Nor does the transitional provision specify whether the application for a MiCAR license must have a certain scope, in particular whether it must cover all transactions continued under the grandfathering provision. However, member states do have the option of shortening the grandfathering period for their jurisdiction. They are not entitled to restrict the scope of Article 143(3) MiCAR or the specific crypto services eligible for grandfathering.

                    Dangers Await CASPs with MiCAR Passporting

                    Shortly before the transitional provision on MiCAR grandfathering came into force, the European Securities and Markets Authority (ESMA) published an opinion in December 2024 on the handling of the MiCAR transitional rule for CASPs. Essentially, ESMA warned both CASPs and the national supervisory authorities responsible for them of problems with the grandfathering regime in the case of companies offering crypto asset services in several EU member states. Problems could arise insofar as individual member states have made very different use of their option to impose stricter time limits on grandfathering than the maximum provided for in MiCAR. While, for example, a maximum grandfathering period of 12 months applies in Germany and Austria, the Netherlands, Poland, and Finland allow only six months. France, Denmark, and the Czech Republic, on the other hand, grant CASPs that make use of grandfathering up to 18 months to fully implement the MiCAR requirements and obtain the necessary authorization. These differences pose risks for CASPs operating in countries with both 12-month and six-month periods if, for example, they wish to use their license obtained in Germany for business in the Netherlands by way of passporting, but BaFin does not grant the required MiCAR license until after the six-month transition period applicable in the Netherlands has expired. In this scenario, the crypto asset service provider in question would face the threat of having to cease its business in the Netherlands, as it would no longer be able to invoke Article 143(3) MiCAR after six months.
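The timing gap described above can be made concrete with a small date calculation. The transition periods are those cited in the text; the month-shifting helper is a simplified sketch for illustration:

```python
from datetime import date

MICAR_CASP_START = date(2024, 12, 30)  # grandfathering period begins

# National grandfathering periods in months, as cited in the text
periods = {
    "Germany": 12, "Austria": 12,
    "Netherlands": 6, "Poland": 6, "Finland": 6,
    "France": 18, "Denmark": 18, "Czech Republic": 18,
}

def add_months(d: date, months: int) -> date:
    """Shift a date by whole months (day-of-month kept; sufficient here)."""
    y, m = divmod(d.month - 1 + months, 12)
    return date(d.year + y, m + 1, d.day)

deadlines = {state: add_months(MICAR_CASP_START, m) for state, m in periods.items()}

# A CASP relying on a German license for Dutch business faces a gap of
# half a year between the two deadlines:
print(deadlines["Netherlands"])  # 2025-06-30
print(deadlines["Germany"])      # 2025-12-30
```

If BaFin grants the license only after June 30, 2025, the Dutch grandfathering window has already closed even though the German one is still open.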

                    BaFin is Required to Monitor the Overall Situation Regarding the Processing of License Applications

                    The ESMA recommendations are addressed to both CASPs and the supervisory authorities responsible for processing MiCAR authorization applications. The crypto asset service providers are strongly advised to submit their MiCAR authorization applications as soon as possible to enable the authorities to process them within the grandfathering period. In addition, CASPs are urged to identify any problems arising from the different durations of the transitional provisions in the individual member states as quickly as possible so that, if necessary, applications for the required MiCAR license can also be submitted in member states with short transition periods. However, ESMA also expects the competent supervisory authorities to act proactively. They should engage in a detailed exchange with applicants at an early stage in order to be informed about business conducted in other member states. In this context, BaFin will have to consult with the supervisory authorities of the other member states as early as possible and on an ongoing basis in order to prevent avoidable disruptions due to a lack of authorizations after the expiry of the grandfathering periods. BaFin will also have to prioritize applications from affected CASPs.

                    Attorney Dr. Lutz Auffenberg, LL.M. (London)

                    I.  https://fin-law.de

                    E. info@fin-law.de


                      Apr 07, 2025

                      Offering AI Investment Tools: A Regulated Activity?

                      Artificial intelligence (AI) or, more specifically, large language models (LLM) are no longer just on the rise, but have already arrived in many areas of business and private life. The use of chatbots and other AI applications is increasingly becoming part of everyday life. A chatbot can now answer questions in just a few seconds that previously required extensive Google searches and visits to many different websites, including navigating annoying cookie banners and often a flood of unwanted advertising. The added value of web searches by chatbots for users cannot be denied. It is therefore no surprise that trust in and the desire for AI-supported tools are increasing in more and more areas of life. One possible use of AI/LLMs is to assist with investment decisions. Recently, ESMA also felt compelled to publish a warning about the risks of using AI in financial investments. This was also published on the BaFin website on March 28, 2025. According to BaFin, consumers should be particularly careful when buy and/or sell signals are AI-generated. AI tools and apps could provide tips or recommendations that may be inaccurate or misleading. Those who invest based on them risk significant financial losses. AI tools and apps are neither authorized nor supervised by financial regulators. This notice once again provides reason to examine the previous classification of providers of automated software-based investment services and, in this context, to examine under which conditions providers of dedicated AI investment tools require a license from BaFin.

                      The So-Called Robo-Advice

                      BaFin has been dealing with the topic of automated distribution of financial instruments and similar digital offers for quite some time. In an article from the 2017 annual report, BaFin already summarized these under the term robo-advice and stated that such advice generally meets the legal definition of investment advice or financial portfolio management and therefore requires a license under banking or commercial law. In a later article from 2020, entitled “Robo-Advice – Automated Investment Advice and Financial Portfolio Management”, BaFin reiterated its view that robo-advice can be legally classified as investment advice, financial portfolio management, acquisition brokerage, or investment brokerage. In its 2022 information notice “Automatisierte und signalbezogene Beratungs- und Handelssysteme” (Automated and Signal-Based Advisory and Trading Systems), BaFin once again emphasized that a conclusive regulatory assessment is only possible if BaFin is provided with the contractual agreements between the provider and its customers in individual cases. Liability for robo-advice has also been addressed in case law. In a judgment dated May 30, 2018 – 12 U 95/16 – the Higher Regional Court of Hamm ruled that in the case of automated online trading in financial products, proprietary trading (which does not require a license) is deemed to be carried out by the person “who decides on the fundamental settings and specifications of the software”. The court stated that the decisive factor is not who actually makes the settings or where the software is installed (on the customer’s hardware or in the cloud). The court regards the main criterion as being who made the “decisive specifications” in the relationship between the parties. In the legal literature, it has been argued, among other things, that in the case of software with abstractly defined trading algorithms, the software provider has no discretion. 
The user is responsible for the use of the software. The decisive factor is which contractual partner can ultimately decide on its use or non-use (usually the user). Therefore, automated portfolio management should at least not be subject to authorization as financial portfolio management.

                      What are the Arguments For and Against Requiring Permission for Providers of AI Investment Tools?

                      First of all, it must be noted that the judgment of the Higher Regional Court of Hamm cannot be applied across the board to all robo-advisers and AI investment tools, since it is based on a case in which the investor himself actually provided essential specifications for the software. Furthermore, investors require the same level of protection with AI systems as they would with advice or management from a human. The mere power of disposition of the investor (activation/deactivation) does not change the lack of predictability of the AI decisions. LLMs are characterized precisely by the fact that they do not merely follow predefined algorithms. Without predictability for the investor, an investment decision should not be attributable to the investor. If an investor uses an ordinary AI chatbot and asks it for help with investment decisions, it is unlikely that the provider of this chatbot can be said to be performing an activity that requires a license. The situation could be different if AI-supported software is explicitly offered that automatically manages the investor’s portfolio and makes buy and sell decisions for the investor. Providers of AI investment tools should therefore check in each individual case whether their own application includes activities that require a license. If necessary, the business model should be adapted to avoid authorization requirements or to obtain a license. Consideration could also be given to cooperating with market participants who already have the necessary authorizations. After analyzing one’s own business model, an inquiry should first be made to BaFin to clarify one’s own intentions before the AI investment tool is offered to investors in Germany.

                      FIN LAW

                      I.  https://fin-law.de

                      E. info@fin-law.de

                      The lawyer responsible for questions relating to AI Investment Tools, Robo-Adviser and IT law at our law firm is Attorney Lutz Auffenberg LL.M. (London).


                        Mar 31, 2025

                        Issuance of Stablecoins with a Value of up to EUR 5 Million – What Advantages Does MiCAR Offer Small ART Issuers?

                        So-called Asset-Referenced Tokens (ART) have been strictly regulated under the Markets in Crypto Assets Regulation (MiCAR) since the summer of 2024. According to the MiCAR definition, ARTs are a special form of crypto-assets that attempt to maintain value stability by referencing one or more other assets, without being classified as E-Money Tokens (EMT). Under the new supervisory regime for crypto-assets in the EU, in principle, only credit institutions and issuers specifically authorized for the issuance of ARTs are permitted to issue Asset Referenced Tokens and offer them to the public. However, MiCAR allows for an exception to this principle for micro-issuances if the equivalent value of the ART issued by the respective issuer has not exceeded the threshold of EUR 5 million over a period of twelve months. The average outstanding value is to be calculated at the end of each calendar day. If these conditions are met, the issuer of the Asset Referenced Tokens does not require a MiCAR license and does not have to apply for authorization from the competent authority – in Germany, BaFin. However, this does not eliminate all the other requirements that MiCAR imposes on ART issuers.
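The EUR 5 million test described above is an average of the outstanding value measured at the end of each calendar day over the twelve-month period. A minimal sketch of that calculation, using invented sample figures (in practice there would be one value per calendar day):

```python
# End-of-day outstanding values of the issued ART in EUR
# (invented sample figures; one value per calendar day in practice)
end_of_day_values = [4_800_000, 4_950_000, 5_100_000, 4_700_000]

THRESHOLD_EUR = 5_000_000

average_outstanding = sum(end_of_day_values) / len(end_of_day_values)

# Individual daily peaks above EUR 5 million do not matter on their own;
# what counts is the average of the end-of-day values over the period.
exempt = average_outstanding <= THRESHOLD_EUR

print(average_outstanding, exempt)
```

In this sample the single day above EUR 5.1 million is harmless because the average (EUR 4,887,500) stays below the threshold.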

                        The Obligation to Prepare a Crypto-Assets White Paper Also Applies to ART Issuers under the 5-Million Exception

                        One of the key obligations of issuers of crypto-assets under MiCAR is the requirement to prepare and publish a crypto-assets white paper. ART issuers, in particular, are required to prepare a crypto-assets white paper for the stablecoins they issue. The MiCAR specifies in great detail the content that must be included in the document. Under the exemption for ART issuances below the 5 million threshold, only the requirement to obtain MiCAR authorization as an issuer of Asset Referenced Tokens or to be a credit institution is waived. However, the text of the regulation explicitly requires the preparation of an ART white paper even in cases where the exemption is applied. The exemptions for issues of other crypto-assets – such as offers to no more than 150 investors per member state or free offers of crypto-assets – are generally not applicable to issues of ART. ART issuers therefore cannot get around the obligation to create a white paper, even if they always remain below the equivalent of EUR 5 million with the ART they issue.

                        BaFin Does Not Have to Authorize ART White Papers under the 5-Million Exception

                        MiCAR requires that white papers for ART crypto-assets must be explicitly authorized by the competent authority. This significant difference compared to the white paper to be prepared for other crypto-assets can be explained by the fact that the regulatory requirements for ART issuers are significantly more extensive than those for issuers of other crypto-assets that do not qualify as ART or EMT. However, the requirement to have the white paper authorized does not apply to issuers operating under the exemption for ART issuances below the EUR 5 million threshold. This creates legal uncertainty with regard to the question of exactly how micro-issuers of ART must publish their white paper. This is because the MiCAR provision on the disclosure requirement on the issuer’s website refers, according to its wording, exclusively to crypto-assets white papers for Asset-Referenced Tokens that have been authorized. The Central Bank of Ireland therefore asked ESMA for clarification on this issue in August 2024. However, ESMA’s response is still pending, as the question is still being reviewed by the EU Commission. Issuers of ART issues below the equivalent of EUR 5 million will nevertheless be well advised to publish the white paper on their own website in any case, to keep it available there from the start of the public offering, and to remove it only once no third party still holds any of the issued ART.

                        Attorney Dr. Lutz Auffenberg, LL.M. (London)

                        I.  https://fin-law.de

                        E. info@fin-law.de

                        The lawyer responsible for regulatory questions relating to the authorization as an issuer of asset referenced tokens and for the related exemptions at our law firm is Attorney Lutz Auffenberg, LL.M. (London).
