Introduction
Digital identity verification has become one of the most important legal and technological issues in Turkish fintech. Banks, payment institutions, electronic money institutions, digital lending platforms, crypto asset service providers, investment apps, InsurTech platforms, open banking providers, and Banking-as-a-Service interfaces all need reliable methods to identify customers remotely. A customer may open an account, create a wallet, apply for a loan, trade crypto assets, access investment services, or conclude a financial contract through a mobile application without physically visiting a branch.
Remote onboarding offers major advantages. It reduces friction, expands access to financial services, supports digital banking, improves customer experience, and lowers operational cost. However, it also creates serious legal risks. If identity verification is weak, fraudsters may open accounts with stolen documents, use synthetic identities, launder criminal proceeds, take over accounts, or move funds through mule accounts. If identity verification is excessive, the fintech company may unlawfully process biometric data, retain sensitive images longer than necessary, transfer data abroad without legal safeguards, or expose users to privacy harm.
In Turkey, remote identity verification is regulated through several overlapping legal regimes. Banks are subject to the BRSA Regulation on Remote Identification Methods to be Used by Banks and Establishment of Contractual Relationship in Electronic Environment, which regulates remote identification for acquiring new clients and establishing contractual relationships through electronic communication tools. The regulation expressly states that remote identification must be applied without prejudice to obligations under Law No. 5549 on prevention of laundering proceeds of crime and related legislation.
Fintech companies must also comply with the Turkish Personal Data Protection Law No. 6698, known as KVKK. KVKK aims to protect fundamental rights and freedoms, particularly privacy, in relation to personal data processing, and applies to natural and legal persons processing personal data through automated or non-automated filing systems.
This article explains digital identity verification in Turkish fintech, focusing on remote onboarding, video identification, NFC identity document checks, biometric data, explicit consent, KVKK risks, AML/KYC obligations, cybersecurity, outsourcing, customer rights, and platform liability.
1. What Is Digital Identity Verification?
Digital identity verification is the process of confirming that a person using a digital financial service is who they claim to be. In fintech, this usually occurs during onboarding, account opening, wallet creation, loan application, crypto platform registration, investment account opening, or high-risk transaction approval.
Digital identity verification may include:
Identity document capture
NFC chip reading
Video call verification
Liveness detection
Face matching
Biometric authentication
SMS OTP confirmation
Device verification
IP and geolocation checks
Sanctions and PEP screening
Identity Sharing System checks
Fraud database checks
Document authenticity analysis
Behavioral risk scoring
Remote contract approval
Digital identity verification should not be confused with ordinary login authentication. Login authentication asks whether the same user is returning to an existing account. Identity verification asks whether the person is legally and factually the real individual whose identity is being used.
In regulated fintech, identity verification is not optional. It is central to AML/KYC compliance, fraud prevention, customer protection, contract validity, data security, and regulatory trust.
2. Remote Onboarding in Turkish Financial Services
Remote onboarding means establishing a customer relationship without physical presence. The customer completes the onboarding process through mobile or web channels, often through identity document verification, video call, NFC reading, biometric matching, and electronic contract approval.
The BRSA banking remote identification regulation states that its purpose is to set procedures and principles for remote identification methods that banks may use for acquiring new clients and establishing contractual relationships through information technology or electronic communication devices in a manner substituting written form.
A similar approach exists for financial leasing, factoring, financing, and savings finance companies. The relevant BRSA regulation states that its purpose is to regulate remote identification methods for acquiring new customers and establishing contractual relationships through information or electronic communication devices, and it expressly preserves obligations under Law No. 5549 and KVKK.
For fintech, this shows a broader regulatory trend: remote onboarding is accepted, but it must be controlled, documented, secure, and compatible with AML and personal data protection duties.
3. Why Remote Identity Verification Is Legally Sensitive
Remote identity verification is legally sensitive because it sits at the intersection of financial crime prevention and privacy protection. A fintech company must verify the customer strongly enough to prevent fraud and money laundering, but it must not process more personal data than necessary.
The main legal risks include:
Wrong identification
Identity theft
Fraudulent account opening
Use of stolen ID documents
Deepfake or presentation attack
Account takeover
Unlawful processing of biometric data
Insufficient explicit consent
Excessive data collection
Unclear privacy notices
Long retention of video recordings
Insecure storage of ID images
Cross-border transfer of biometric data
Vendor misuse of onboarding data
Failure to respond to data subject requests
Data breach
Regulatory sanctions
Customer compensation claims
The key compliance challenge is proportionality. Fintech companies must prove that the identity verification method is necessary, lawful, secure, and proportionate to the financial service offered.
4. Remote Identification Process for Banks
The BRSA banking regulation contains detailed requirements for remote identification. Before implementing a remote identification process, banks must prepare process documents, test effectiveness, document test results, remediate processes that fail testing, and avoid implementation until effectiveness and adequacy are ensured. The process must also be reviewed at least twice a year and additionally reviewed after security breaches, fraud attempts, legislative changes, or newly discovered vulnerabilities.
The video call stage must be conducted by a trained client representative. The representative must be trained on identity documents, verification methods, fraud or forgery indicators, relevant legislation, and personal data protection. The representative must also determine whether the individual voluntarily requests to become a bank customer or benefit from banking services.
These rules show that remote onboarding is not merely “upload your ID and take a selfie.” It is a controlled regulated process. Banks must combine technology, trained personnel, fraud controls, documentation, and periodic review.
Fintech companies working with banks in BaaS or white-label models should align their onboarding flows with these expectations where banking services are involved.
5. Video Call Verification
Video call verification is one of the main remote identification methods in Turkish banking regulation. The process must be real-time and uninterrupted. The integrity and confidentiality of audiovisual communication must be ensured, and the regulation requires secure end-to-end communication for the video call.
The regulation also requires sufficient video and audio quality throughout the call to avoid doubt or limitation in identification. The video quality must allow visual verification of the identity document under white light and examination of security elements to ensure that the document is not worn out or tampered with.
For fintech platforms, video verification creates practical legal questions:
Who conducts the call?
Is the representative trained?
Is the call recorded?
How long is the recording retained?
Is biometric data extracted?
Is the video processed by an AI vendor?
Is data stored in Turkey or abroad?
Is the user informed properly?
Can the user access or request deletion of data?
What happens if the video call quality is poor?
Video verification can be strong evidence, but it is also sensitive personal data processing. Therefore, the process must be designed with privacy and cybersecurity from the beginning.
6. NFC Identity Document Verification
NFC identity document verification is an important safeguard in remote onboarding. Under the BRSA banking regulation, verification of identity information embedded in the contactless chip of the identity document by NFC confirms the match required to identify the person based on the document. The process checks that the document was issued by the authorized authority, that chip information was not altered, and that chip keys were not duplicated or cloned.
If NFC verification cannot be performed, the regulation requires verification of at least four visual security features of the document, and requires that, before a continuous business relationship is established, the first financial transaction be made from the person's account at another bank that applies customer identification principles.
This is highly relevant for fintech. NFC-based verification is generally stronger than manual document upload because it reduces forged-document risk. However, NFC verification also raises data protection issues because chip data, ID images, and matching results are personal data. The fintech company must determine exactly what is read, stored, logged, shared, and deleted.
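The "what is read, stored, logged, shared, and deleted" question can be made concrete with a data-minimization pattern: retain the verification outcome and an auditable reference, not the raw chip dump. The sketch below is illustrative only; the field names are assumptions, and the actual chip data groups and authentication checks come from the vendor SDK or eMRTD tooling, not from this code.

```python
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

# Field names are illustrative assumptions; real eMRTD reads expose signed data
# groups and passive/chip authentication results through the verification SDK.

@dataclass
class NfcVerificationRecord:
    """What the platform retains: the result and an audit trail, not the raw chip data."""
    document_number_hash: str  # pseudonymised reference, not the raw document number
    issuer_verified: bool      # document issued by the authorised authority
    chip_data_intact: bool     # chip information not altered
    chip_not_cloned: bool      # chip keys not duplicated or cloned
    verified_at: str

def minimise_nfc_result(raw_chip_read: dict) -> NfcVerificationRecord:
    """Keep only what the retention policy justifies; the caller discards the raw read."""
    return NfcVerificationRecord(
        document_number_hash=hashlib.sha256(
            raw_chip_read["document_number"].encode()
        ).hexdigest(),
        issuer_verified=raw_chip_read["passive_auth_ok"],
        chip_data_intact=raw_chip_read["dg_hashes_ok"],
        chip_not_cloned=raw_chip_read["chip_auth_ok"],
        verified_at=datetime.now(timezone.utc).isoformat(),
    )
```

The design choice is that the stored record answers the regulatory questions (was the document genuine, intact, and uncloned, and when) without keeping the identity data itself longer than necessary.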
7. SMS OTP and Mobile Number Confirmation
The BRSA banking regulation requires a centrally generated SMS OTP valid only for the identification transaction to be sent to the individual during remote identification. The system must ensure that the SMS OTP is returned through the online application interface, and successful verification confirms the individual’s mobile phone number.
OTP verification helps connect the applicant to a phone number, but it is not sufficient by itself. SIM swap fraud, stolen phones, malware, and social engineering can compromise SMS-based controls. Fintech companies should therefore use OTP as one layer within a broader risk-based process, not as the sole identity verification measure.
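The regulatory requirements above, that the OTP be centrally generated, valid only for the single identification transaction, and returned through the application interface, can be sketched as follows. This is a minimal illustration, not an implementation of any bank's system; the validity window and the in-memory store are assumptions.

```python
import hashlib
import hmac
import secrets
import time

# Illustrative in-memory store; a real system would use a secure central service.
_otp_store = {}

OTP_TTL_SECONDS = 120  # assumed validity window, not a regulatory figure

def issue_otp(identification_tx_id: str) -> str:
    """Generate a 6-digit OTP bound to exactly one remote-identification transaction."""
    code = f"{secrets.randbelow(10**6):06d}"
    _otp_store[identification_tx_id] = {
        "hash": hashlib.sha256(code.encode()).hexdigest(),  # never store the raw code
        "expires_at": time.time() + OTP_TTL_SECONDS,
        "used": False,
    }
    return code  # delivered to the applicant's mobile number via SMS

def verify_otp(identification_tx_id: str, submitted: str) -> bool:
    """Accept the OTP once, only for this transaction, and only before expiry."""
    entry = _otp_store.get(identification_tx_id)
    if entry is None or entry["used"] or time.time() > entry["expires_at"]:
        return False
    ok = hmac.compare_digest(
        entry["hash"], hashlib.sha256(submitted.encode()).hexdigest()
    )
    entry["used"] = True  # single use, even on failure, to limit guessing attempts
    return ok
```

Binding the code to one transaction identifier and enforcing single use is what makes the OTP evidence of this identification attempt rather than a reusable credential.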
8. Biometric Data in Remote Onboarding
Biometric data is one of the most sensitive parts of digital identity verification. Turkish law treats biometric data as a special category of personal data. KVKK Article 6 expressly lists biometric and genetic data among special categories of personal data.
The KVKK Authority’s public explanation similarly states that biometric and genetic data are special categories of personal data, and that special categories deserve specific protection because misuse can expose data subjects to discrimination or unfair treatment.
In remote onboarding, biometric processing may occur through:
Face matching
Liveness detection
Facial geometry analysis
Voice analysis
Fingerprint verification
Behavioral biometrics
Device interaction patterns
Anti-spoofing checks
AI-based video analysis
A company should not assume that a selfie is ordinary data. If the system uses technical processing to identify or verify a person uniquely, biometric data risk may arise.
9. Biometric Data and Explicit Consent
Under the BRSA banking remote identification regulation, among special categories of personal data, only the person's biometric data may be used for identification purposes in the remote identification process, and the individual's explicit consent to this must be recorded electronically.
KVKK defines explicit consent as freely given, specific, and informed consent. KVKK Article 6, as amended in 2024, states that processing special categories of personal data is prohibited unless one of the listed conditions applies, including explicit consent or another condition expressly set out in the law. It also requires adequate measures determined by the Board when processing special categories.
For fintech companies, explicit consent must be real. It should not be hidden inside general terms. It should not be bundled with unrelated marketing consent. It should clearly explain:
What biometric data is processed
Why it is processed
Whether it is mandatory for remote onboarding
Whether there is an alternative method
Who processes it
Whether vendors are involved
Whether it is transferred abroad
How long it is retained
How it is secured
What rights the customer has
If biometric processing is not necessary or if a less intrusive alternative exists, the company may face proportionality problems even where consent is obtained.
10. KVKK General Principles
KVKK Article 4 requires personal data processing to comply with core principles: lawfulness and fairness, accuracy and being kept up to date where necessary, processing for specified, explicit and legitimate purposes, relevance, limitation and proportionality, and storage only for the period required by legislation or processing purpose.
These principles are crucial for digital identity verification. A fintech company should not collect every possible document, selfie, video, voice sample, location signal, device fingerprint, and behavioral data merely because the technology allows it. The company must justify each data category.
For example:
ID image may be necessary for verification.
NFC chip data may be necessary for authenticity.
Face image may be necessary for matching.
Long-term storage of raw biometric templates may not be necessary.
Location data may not be necessary for every onboarding process.
Marketing use of onboarding data is generally a separate purpose.
The principle of proportionality should guide the entire onboarding design.
11. KVKK Obligation to Inform
KVKK Article 10 requires the data controller to inform data subjects at the time personal data is obtained about the identity of the data controller, processing purposes, transfer recipients and purposes, method and legal basis of collection, and the rights under Article 11.
In fintech onboarding, the privacy notice should be shown before data collection begins, not after the customer has already uploaded an ID document and completed a face scan.
A proper onboarding privacy notice should explain:
Identity of the fintech company
Whether the company is a bank, payment institution, e-money institution, crypto platform, or service provider
Purpose of identity verification
Legal basis for ordinary personal data
Legal basis for biometric data
Whether explicit consent is requested
Recipients of the data
Whether cloud or KYC vendors are used
Retention period
Cross-border transfers
Data subject rights
Complaint mechanism
A vague statement such as “we process your data for service purposes” is not enough for a high-risk remote onboarding flow.
12. Data Subject Rights
KVKK Article 11 grants data subjects rights such as learning whether their data is processed, requesting information, learning the processing purpose, knowing domestic or foreign recipients, requesting rectification, requesting erasure or destruction under Article 7, objecting to results against them arising from analysis solely through automated systems, and claiming compensation for damage caused by unlawful processing.
These rights matter in fintech identity verification. A rejected applicant may ask why onboarding failed. A customer may request deletion of onboarding recordings after account closure. A user may object to an automated identity verification failure. A person whose identity was misused may request information and correction.
Fintech companies should build customer rights workflows into onboarding systems. Data should be searchable, exportable, correctable, and deletable where legally required. If the company cannot locate onboarding records, it cannot properly comply with KVKK requests.
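One recurring design problem in these workflows is that an erasure request under KVKK collides with AML retention duties: some onboarding records can be deleted, others must be kept. A minimal sketch of that triage, with purely illustrative record categories and legal-hold flags, might look like this:

```python
# Illustrative record index; categories and legal-hold rules are assumptions,
# not a statement of the statutory retention periods that actually apply.

RECORDS = [
    {"subject": "c-100", "category": "video_recording", "legal_hold": True},   # e.g. AML duty
    {"subject": "c-100", "category": "marketing_profile", "legal_hold": False},
]

def locate(subject_id: str) -> list:
    """KVKK Art. 11: the subject may learn whether and which data is processed."""
    return [r for r in RECORDS if r["subject"] == subject_id]

def erase(subject_id: str) -> dict:
    """Delete what can lawfully be deleted; records under a retention duty are kept,
    and the response explains which categories were retained and why."""
    deleted, retained = [], []
    for r in locate(subject_id):
        (retained if r["legal_hold"] else deleted).append(r["category"])
    RECORDS[:] = [
        r for r in RECORDS
        if not (r["subject"] == subject_id and not r["legal_hold"])
    ]
    return {"deleted": deleted, "retained_due_to_legal_duty": retained}
```

The point of returning both lists is transparency: the data subject receives a reasoned answer rather than a silent partial deletion.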
13. Data Security Obligations
KVKK Article 12 requires the data controller to take all necessary technical and organizational measures to provide an appropriate level of security to prevent unlawful processing, prevent unlawful access, and ensure protection of personal data. If data is processed by another person on behalf of the controller, the controller is jointly responsible with those persons for taking these measures.
This is critical where fintech companies use third-party identity verification vendors, cloud providers, liveness detection tools, OCR systems, AML screening platforms, or outsourced call centers. The fintech company cannot simply say that the vendor caused the breach. It must select vendors carefully, impose contractual security obligations, audit performance, and monitor incidents.
Onboarding data should be protected with:
Encryption
Access control
Role-based permissions
Logging
Data minimization
Secure deletion
Segregated storage
Vendor due diligence
Penetration testing
Incident response
Retention limits
Employee confidentiality
Secure API integration
Anti-fraud monitoring
Regular access reviews
Identity documents and biometric templates are high-value data. A breach may expose customers to fraud for years.
14. KVKK Biometric Data Guidance
The KVKK Authority’s biometric data guidance explains that biometric data refers to personal data resulting from specific technical processing relating to physical, physiological or behavioral characteristics that allow or confirm unique identification, such as facial images or fingerprint data. It also notes that biometric data includes physiological biometric data such as fingerprints, retina, palm, face, hand shape and iris, and behavioral biometric data such as gait, smartphone screen movement, keyboard pressing style, or driving style.
The guidance emphasizes that biometric data processing must comply with Article 4 general principles and Article 6 special category processing conditions. It also highlights proportionality and notes that processing may not be proportional where an alternative exists.
For fintech, this means biometric onboarding must be justified case by case. A high-risk banking account may justify stronger identity verification. A low-risk loyalty account may not justify storing biometric templates. The product risk, regulatory obligation, fraud risk, and available alternatives should be evaluated together.
15. Remote Onboarding for Payment and E-Money Institutions
Payment institutions and electronic money institutions are important fintech actors in Turkey. They operate under Law No. 6493 and must obtain CBRT operating licenses. TÖDEB explains that payment institutions and electronic money institutions are required to obtain an operating license from the CBRT, and that they are subject to CBRT supervision and MASAK liability audits.
Remote onboarding for payment and e-money services must be designed with payment services regulation, AML/KYC obligations, fraud prevention, and KVKK compliance. A wallet provider that allows low-value transactions may not need the same onboarding intensity as a bank account with broader transaction limits, but it must still manage identity, risk, transaction monitoring, and customer data lawfully.
Payment and e-money onboarding should address:
Customer identity verification
Risk-based limits
AML screening
Wallet fraud prevention
Device verification
Strong authentication
Privacy notice
Explicit consent for biometric data where used
Vendor security
Complaint channels
Record retention
The onboarding process should match the risk profile of the wallet or payment service. Overly weak verification invites fraud; overly intrusive verification creates privacy risk.
16. Remote Onboarding for Crypto Asset Service Providers
Crypto asset service providers have become more formally regulated in Turkey. The CMB’s Communiqué III-35/B.1 provides that crypto asset service providers may conduct onboarding via remote identification under relevant provisions of Communiqué III-42.1, while also imposing rules on crypto asset service providers’ establishment and operation.
Remote identity verification is especially important in crypto because crypto transfers can be fast, cross-border, and irreversible. Weak onboarding may allow fraudsters, illegal betting networks, mule accounts, and money launderers to use platforms.
Crypto onboarding should address:
Legal identity verification
Sanctions and PEP screening
Wallet address risk
Source of funds checks
Device and IP risk
Deepfake detection
Liveness detection
Fraud blacklists
Travel Rule data where applicable
Suspicious transaction monitoring
Withdrawal risk controls
Account takeover prevention
Crypto platforms should also inform users that identity verification is not only a registration formality. It is part of AML, security, withdrawal protection, and regulatory compliance.
17. AI-Based Identity Verification
Many fintech companies use AI-based identity verification tools. These tools may read identity documents, compare face images, detect liveness, flag forgery, analyze video quality, detect deepfakes, or score fraud risk.
AI can improve onboarding, but it creates legal risks:
False rejection of legitimate users
False acceptance of fraudsters
Bias against certain groups
Opaque decision-making
Excessive biometric processing
Vendor reuse of data
Cross-border transfer of identity data
Insufficient human review
Inability to explain automated decisions
Poor model testing
Under KVKK Article 11, a data subject may object to a result against them arising from analysis solely through automated systems. Therefore, if a fintech rejects onboarding based solely on AI scoring, it should have a human review or appeal mechanism.
AI tools should be governed through model documentation, vendor due diligence, bias testing, audit logs, incident response, and proportionality assessment.
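The Article 11 objection right translates into a simple routing rule: only approval may be fully automated, while any adverse outcome is escalated to a trained reviewer, so that no rejection arises solely from automated analysis. The threshold below is an illustrative tuning parameter, not a regulatory value.

```python
# Illustrative routing rule: the model may approve, but never finally reject.
APPROVE_THRESHOLD = 0.90  # assumed confidence cut-off, set by the company's own testing

def route_onboarding(ai_match_score: float) -> str:
    """Route an AI identity-match score so that no adverse outcome is produced
    solely by automated analysis (KVKK Art. 11 objection right)."""
    if ai_match_score >= APPROVE_THRESHOLD:
        return "approve"
    # Low and ambiguous scores alike go to a human reviewer, who issues any
    # final rejection and whose decision is logged for the appeal record.
    return "human_review"
```

A rejected applicant who appeals should likewise land in the human review queue, with the reviewer's reasoning preserved as evidence.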
18. Deepfake and Presentation Attack Risk
Remote identity verification must address presentation attacks and deepfakes. Fraudsters may use printed ID copies, altered documents, stolen identity photos, face masks, replayed videos, screen recordings, synthetic media, or AI-generated faces to pass onboarding.
Fintech companies should consider:
Liveness detection
Random movement instructions
NFC chip verification
Document security feature checks
Video call supervision
Device fingerprinting
Duplicate identity detection
High-risk pattern monitoring
Manual escalation
Post-onboarding transaction monitoring
The BRSA regulation’s requirements on real-time video calls, document tilting, security element checks, NFC verification, OTP confirmation, and trained representatives show the importance of layered controls.
No single control is perfect. Remote onboarding should combine document, biometric, behavioral, device, and transaction-level controls.
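One common way to combine such layered controls, offered here as a hypothetical sketch rather than any regulator's prescribed method, is to treat a few controls as hard gates and feed the remaining signals into a risk score that triggers manual escalation. All names, weights, and thresholds below are assumptions.

```python
# Illustrative layered decision: no single control is treated as sufficient alone.
HARD_CONTROLS = ["nfc_chip_valid", "liveness_passed"]  # any failure here blocks onboarding

SOFT_WEIGHTS = {  # remaining signals feed a cumulative risk score (assumed weights)
    "device_reputation_bad": 40,
    "duplicate_identity_hit": 50,
    "ip_geolocation_mismatch": 15,
    "ocr_nfc_data_mismatch": 35,
}
ESCALATION_SCORE = 50  # assumed threshold for routing to manual review

def assess(signals: dict) -> str:
    """Block on hard-control failure; otherwise escalate when soft risk accumulates."""
    if not all(signals.get(c, False) for c in HARD_CONTROLS):
        return "block"
    score = sum(w for flag, w in SOFT_WEIGHTS.items() if signals.get(flag, False))
    return "escalate_to_manual_review" if score >= ESCALATION_SCORE else "proceed"
```

The structure mirrors the legal point: a deepfake that defeats one control still has to defeat the chip check, the device signals, and the duplicate-identity screen before onboarding proceeds unreviewed.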
19. Outsourcing Identity Verification
Many fintech companies outsource identity verification to vendors. Vendors may provide OCR, NFC reading, face matching, liveness detection, biometric templates, fraud scoring, sanctions screening, or video call infrastructure.
Outsourcing creates legal responsibility issues. Under KVKK Article 12, if personal data is processed by another person on behalf of the controller, the data controller is jointly responsible for security measures. Therefore, vendor contracts must be detailed.
A vendor agreement should cover:
Data controller and processor roles
Processing purpose
Data categories
Security controls
Subprocessors
Cross-border transfers
Retention and deletion
Audit rights
Incident notification
Biometric data restrictions
No vendor reuse for model training unless lawfully structured
Regulatory cooperation
Service levels
Liability and indemnity
Termination support
A fintech company should not use a vendor that cannot explain where biometric data is stored, who accesses it, or whether it is used to train models.
20. Cross-Border Transfer Risks
Identity verification often involves international vendors and cloud infrastructure. ID images, biometric templates, liveness videos, logs, and fraud scores may be processed abroad. This raises KVKK Article 9 issues.
KVKK Article 9 was amended in 2024 and now allows transfer abroad where one of the Article 5 or Article 6 processing conditions exists and there is an adequacy decision for the relevant country, sector, or international organization. In the absence of adequacy, transfers may be possible through appropriate safeguards if data subjects can exercise their rights and access effective remedies.
For digital onboarding, cross-border transfer analysis should cover:
Cloud storage location
Backup location
Remote support access
Vendor subprocessors
AI model training location
Log processing location
Biometric template storage
Security operations centers
Contractual safeguards
Data subject rights
Onward transfers
A fintech company should not assume that using a global KYC vendor automatically satisfies Turkish data transfer rules.
21. Retention of Identity Verification Data
Identity verification records may need to be retained for AML, regulatory, contract, audit, dispute, or fraud prevention purposes. However, retention must still comply with KVKK. Article 4 requires personal data to be stored only for the period laid down by relevant legislation or required for the purpose of processing.
The KVKK biometric guidance also states that the maximum period for processing should be determined, and that raw and derived biometric records must be processed only for the required time, with reasons explained in the retention and destruction policy.
Fintech companies should distinguish between:
Identity document copies
Video call recordings
Face images
Biometric templates
Liveness test outputs
NFC verification results
Audit logs
Risk scores
Consent records
AML records
Rejected application records
Not all data needs the same retention period. Raw biometric data should not be kept indefinitely unless there is a clear legal basis and necessity.
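Differentiated retention is easiest to enforce when the schedule is explicit per data category and deletion is computed from it. The periods below are placeholders for illustration only; the real figures must come from AML legislation, sector regulation, and the company's own retention and destruction policy.

```python
from datetime import date, timedelta

# Illustrative schedule only: actual periods are set by legislation and by the
# company's retention and destruction policy, not by this sketch.
RETENTION_DAYS = {
    "aml_identification_record": 8 * 365,  # assumed long statutory period
    "video_call_recording": 365,
    "raw_biometric_template": 30,          # raw biometrics kept as briefly as possible
    "rejected_application_log": 2 * 365,
}

def destruction_due(category: str, collected_on: date, today: date) -> bool:
    """True when the category's retention period has lapsed and deletion is due."""
    return today > collected_on + timedelta(days=RETENTION_DAYS[category])
```

Running such a check on a schedule, and logging each destruction, is also what lets the company demonstrate compliance with its retention and destruction policy during an audit.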
22. Customer Experience and Legal Validity
Remote onboarding must be legally valid and user-friendly. If the process is too complex, customers abandon it. If it is too simple, fraud risk increases. If consent screens are unclear, KVKK risk increases. If the contract approval process is weak, enforceability may be questioned.
A fintech onboarding flow should ensure:
Clear explanation before data collection
Accessible privacy notice
Separate biometric consent where required
Readable contract terms
Transaction-specific OTP
Strong authentication
User confirmation of identity and contract
Clear error messages
Alternative process where legally possible
Accessibility for persons with disabilities
Complaint and support channels
The BRSA regulation requires client representatives to receive training to serve individuals with disabilities. This reflects a broader principle: digital onboarding should not exclude users unfairly.
23. Liability for Wrong Identification
If a fintech company incorrectly identifies a person, several disputes may arise. A fraudster may open an account in someone else’s name. A legitimate customer may be rejected. Funds may be transferred through a mule account. A customer may suffer credit damage, crypto loss, or enforcement problems.
Liability may arise from:
Weak document verification
Failure to detect forged ID
Poor video quality
Inadequate representative training
Faulty AI matching
No liveness detection
Vendor error
Insufficient risk assessment
Failure to review repeated attempts
Data breach enabling identity theft
Inadequate account monitoring after onboarding
The company’s best defense is evidence: onboarding logs, video records, NFC verification results, consent records, risk assessment outputs, customer representative notes, system test records, and periodic review documentation.
24. Practical Compliance Checklist for Digital Identity Verification in Turkish Fintech
A fintech company should consider:
Classify the regulated service before designing onboarding.
Identify applicable regulator: BRSA, CBRT, CMB, MASAK, KVKK Authority, or others.
Determine whether remote identification rules apply.
Design a risk-based onboarding process.
Use NFC verification where appropriate.
Use video call or liveness controls where required.
Train representatives where human verification is used.
Prepare process documentation.
Test onboarding effectiveness before launch.
Review the process periodically and after fraud incidents.
Prepare KVKK privacy notices.
Obtain explicit consent for biometric data where required.
Avoid unnecessary biometric processing.
Map data flows and vendors.
Review cross-border transfers.
Limit retention of raw biometric data.
Implement encryption, access control, and audit logs.
Prepare breach response procedures.
Offer human review for automated rejection.
Preserve onboarding evidence.
Review vendor contracts.
Prepare customer complaint workflows.
Monitor fraud patterns after onboarding.
This checklist must be adapted to each business model. A bank, payment institution, e-money wallet, crypto platform, investment app, digital lender, InsurTech provider, and BaaS interface provider will not have identical obligations.
Why Legal Support Is Important
Digital identity verification requires legal support because it combines financial regulation, AML/KYC, biometric data, KVKK, outsourcing, cybersecurity, consumer protection, contract validity, and platform liability. A technically impressive onboarding tool may still be unlawful if it processes biometric data without proper basis, transfers data abroad without safeguards, or fails to meet remote identification rules.
A fintech lawyer can assist with:
Remote onboarding legal analysis
BRSA remote identification compliance
CBRT payment and e-money onboarding review
CMB crypto onboarding review
AML/KYC process mapping
KVKK privacy notice drafting
Biometric explicit consent design
Vendor contract review
Cross-border transfer assessment
Data retention policy
Cybersecurity and incident clauses
Automated decision review
Customer complaint strategy
Regulatory correspondence
Identity fraud dispute defense
Legal review should begin before selecting vendors and designing the user journey. Once biometric onboarding data has already been collected unlawfully, remediation may be difficult, costly, and reputationally damaging.
Conclusion
Digital identity verification is a foundation of Turkish fintech. It enables remote onboarding, digital banking, wallet creation, crypto trading, digital lending, open banking, and online financial contracts. However, it is also one of the highest-risk areas of fintech compliance because it involves identity documents, biometric data, video calls, fraud prevention, AML/KYC, automated decisions, and sensitive personal data.
Turkey allows remote onboarding in regulated financial services, but it does not allow uncontrolled onboarding. The BRSA banking regulation requires documented processes, effectiveness testing, periodic review, trained representatives, risk assessment, secure real-time video calls, OTP confirmation, NFC identity checks where possible, and electronically recorded explicit consent for biometric data used in remote identification.
KVKK adds another layer. Biometric data is special category personal data, processing must satisfy Article 6 conditions, adequate measures must be taken, data subjects must be informed, security must be ensured, and personal data must be limited, proportionate, and retained only as necessary.
The practical message is clear: remote onboarding is not just a growth tool. It is a regulated identity, privacy, and security process. Fintech companies must balance fast onboarding with strong fraud prevention and strict personal data protection. The safest systems are those that collect only necessary data, verify identity through layered controls, obtain clear consent where required, protect biometric data, document every step, review vendors carefully, and preserve evidence for disputes and audits.
In Turkish fintech, trust begins at onboarding. A customer who entrusts a platform with identity documents, biometric data, and financial access expects legal compliance, security, and transparency. Companies that build digital identity verification with legal architecture from the beginning will be better positioned to satisfy regulators, prevent fraud, protect users, and scale sustainably.