Category: Troubleshooting | Last Updated: 2026-02-12

---

Overview

Crypto-Asset Service Providers (CASPs) with large user bases or high transaction volumes may generate DAC8 report files that are extremely large. Large files can cause problems during generation, validation, and submission, including memory errors, upload timeouts, and rejection by tax authority portals. This article covers the common challenges associated with large DAC8 files and provides practical strategies for managing them.

---

Problem: File Exceeds the Tax Authority's Size Limit

Cause

Many tax authority submission portals impose maximum file size limits. These limits vary by jurisdiction but can range from tens of megabytes to a few hundred megabytes. A report covering thousands or tens of thousands of Reportable Users with detailed transaction records can easily exceed these thresholds.

Fix

  1. Check the applicable limit. Before generating your report, verify the maximum file size accepted by the target tax authority's submission portal. This information is typically documented in the authority's technical guidance or submission instructions.
  2. Split the report into multiple files. If your file exceeds the limit, divide the report into smaller parts. Each part should be a valid, self-contained XML file that conforms to the DAC8 schema. See the section on splitting reports below for details.
  3. Use compression if supported. Some submission portals accept compressed files (typically ZIP or GZIP format). If the portal supports compression, compress the XML file before uploading. This can reduce file size significantly, as XML is highly compressible.
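If the portal accepts ZIP archives, the compression step can be a one-liner with Python's standard library. This is a minimal sketch: the function name and the one-file-per-archive convention are assumptions, not a portal requirement.

```python
import os
import zipfile

def compress_report(xml_path: str, zip_path: str) -> None:
    """Compress one XML report into a ZIP archive (one file per archive)."""
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.write(xml_path, arcname=os.path.basename(xml_path))
```

Because DAC8 XML is highly repetitive (element names recur for every record), DEFLATE typically shrinks it by an order of magnitude or more.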

---

Problem: Report Generation Fails Due to Memory Constraints

Cause

Generating a large XML file in memory (for example, by building a complete DOM tree before writing to disk) can exhaust available memory, especially on systems with limited resources.

Fix

  1. Use streaming XML generation. Instead of building the entire document in memory, use a streaming approach (such as SAX-based writing or a streaming XML library) that writes elements to disk as they are generated.
  2. Process data in batches. Fetch and process Reportable User data in batches rather than loading all records into memory at once.
  3. Allocate sufficient resources. If streaming is not feasible in your current architecture, ensure the report generation process has access to adequate memory. This may involve running the process on a dedicated server or allocating additional resources temporarily.
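Steps 1 and 2 can be combined: write each record to disk the moment it is produced, consuming user data batch by batch. The sketch below uses Python's streaming `XMLGenerator`; the element names (`DAC8Report`, `ReportableUser`, `TIN`) are illustrative placeholders, not the official schema.

```python
from xml.sax.saxutils import XMLGenerator

def write_report_streaming(out_path, user_batches):
    """Stream records to disk as they are produced instead of building a DOM.

    `user_batches` is any iterable yielding lists of user dicts, so the
    full data set never has to sit in memory at once.
    """
    with open(out_path, "w", encoding="utf-8") as out:
        gen = XMLGenerator(out, encoding="utf-8")
        gen.startDocument()
        gen.startElement("DAC8Report", {})
        for batch in user_batches:          # data fetched batch by batch
            for user in batch:
                gen.startElement("ReportableUser", {})
                gen.startElement("TIN", {})
                gen.characters(user["tin"])
                gen.endElement("TIN")
                gen.endElement("ReportableUser")
        gen.endElement("DAC8Report")
        gen.endDocument()
```

Memory use stays roughly constant regardless of record count, because only the current batch is ever held in memory.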

---

Problem: Validation Takes Too Long or Times Out

Cause

XSD validation of a very large XML file can be time-consuming, particularly if the validator loads the entire file into memory or performs multiple passes.

Fix

  1. Use a streaming validator. Some XML validation tools support streaming validation, which processes the file incrementally rather than loading it all at once.
  2. Validate a representative sample first. Before validating the complete file, extract a small subset of records into a separate file and validate that subset. This catches structural schema issues (missing mandatory elements, incorrect ordering, malformed values) quickly, without the overhead of full-file validation.
  3. Validate locally before submission. Run validation on your own infrastructure rather than relying solely on the tax authority's portal, which may have shorter timeout limits.
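Extracting a sample for step 2 can itself be done in streaming fashion, so the large file is never fully loaded. This sketch uses stdlib `iterparse`; the element names are illustrative, and a real sample file would also need to retain the header sections (such as MessageSpec) that the schema requires.

```python
import xml.etree.ElementTree as ET

def extract_sample(in_path, out_path, max_records=100):
    """Copy the first `max_records` user records from a large report into a
    small file suitable for a quick validation pass."""
    root = ET.Element("DAC8Report")
    count = 0
    with open(in_path, "rb") as f:
        for _event, elem in ET.iterparse(f, events=("end",)):
            if elem.tag == "ReportableUser":
                root.append(elem)
                count += 1
                if count >= max_records:
                    break          # stop reading once the sample is full
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)
```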

---

Splitting Reports into Multiple Files

When splitting a large report, follow these guidelines:

Maintain Schema Compliance

Each split file must be a complete, valid XML document. This means each file must include:

  • The XML declaration and correct namespace declarations.
  • A valid MessageSpec section with a unique MessageRefId.
  • One or more ReportableUser records with their associated data.

Assign Unique Message Reference IDs

Each split file must have its own unique MessageRefId. Do not reuse the same MessageRefId across multiple files, as this may cause the tax authority to treat them as duplicate submissions.

Distribute Records Logically

  • Assign each Reportable User to exactly one file. Do not split a single user's data across multiple files, as this can complicate corrections and reconciliation.
  • If a single user's data is so large that it alone approaches the file size limit, check whether the tax authority has guidance on handling this edge case.

Track the Split

Maintain an internal record of how the report was split, including which users and records are in each file. This is essential for:

  • Submitting corrections later (you need to know which file contained the original record).
  • Reconciling acknowledgments from the tax authority.
  • Internal auditing.
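The guidelines above can be sketched as one splitting routine: each user lands in exactly one file, each file gets its own MessageRefId, and the return value doubles as the internal split record. The element names, the `sending_id` default, and the MessageRefId pattern are illustrative assumptions, not the official DAC8 format.

```python
import os
import xml.etree.ElementTree as ET

def split_report(users, max_per_file, out_dir, sending_id="XX2026DAC8"):
    """Partition users across files and return a manifest of the split.

    Returns {output_path: [TINs in that file]}, which supports later
    corrections, acknowledgment reconciliation, and auditing.
    """
    manifest = {}
    for part, start in enumerate(range(0, len(users), max_per_file), start=1):
        chunk = users[start:start + max_per_file]
        root = ET.Element("DAC8Report")
        spec = ET.SubElement(root, "MessageSpec")
        # Unique per file: reuse risks a duplicate-submission rejection.
        ET.SubElement(spec, "MessageRefId").text = f"{sending_id}-{part:04d}"
        for user in chunk:
            u = ET.SubElement(root, "ReportableUser")
            ET.SubElement(u, "TIN").text = user["tin"]
        out_path = os.path.join(out_dir, f"report_part_{part:04d}.xml")
        ET.ElementTree(root).write(out_path, encoding="utf-8",
                                   xml_declaration=True)
        manifest[out_path] = [u["tin"] for u in chunk]
    return manifest
```

Persist the returned manifest (for example, to a database table keyed by MessageRefId) rather than keeping it only in memory.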

---

Batch Submission Strategies

Sequential Submission

Submit files one at a time, waiting for acknowledgment of each before submitting the next. This approach is slower but ensures you can address any rejection before proceeding.

Parallel Submission

If the tax authority's portal supports it, submit multiple files simultaneously. Verify that the portal does not have rate limits or concurrent upload restrictions that could cause rejections.

Scheduled Submission

If you have a large number of files, consider scheduling submissions over a period of time (for example, spreading uploads across several hours or days before the deadline). This reduces the risk of encountering portal congestion near the filing deadline.
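The sequential strategy can be sketched as a simple loop. The `submit` callable here is hypothetical, standing in for whatever client wraps your portal's upload API; the `accepted` acknowledgment field is likewise an assumption.

```python
def submit_sequentially(paths, submit, max_retries=2):
    """Submit one file at a time, waiting for each acknowledgment before
    sending the next. `submit(path)` is a placeholder for the portal upload
    call and is assumed to return a dict with an 'accepted' flag."""
    acks = {}
    for path in paths:
        for _attempt in range(max_retries):
            ack = submit(path)
            if ack.get("accepted"):
                acks[path] = ack
                break
        else:
            # Stop so the rejection can be fixed before later files go out.
            raise RuntimeError(f"{path} rejected after {max_retries} attempts")
    return acks
```

Parallel submission would replace the outer loop with a worker pool, but only after confirming the portal's rate and concurrency limits.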

---

Compression Best Practices

If the tax authority accepts compressed files:

  1. Use the supported format. Typically ZIP or GZIP. Do not use formats that the portal may not recognize (such as RAR or 7z).
  2. Compress each XML file individually. Place one XML file per archive unless the portal specifically requires or allows multiple files in a single archive.
  3. Verify the compressed file. After compression, decompress the file and re-validate the XML to ensure the compression process did not corrupt the data.
  4. Check the compressed file size. Some portals have size limits on the compressed file, not just the uncompressed XML. Verify which limit applies.
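The verification step (point 3) can be automated: CRC-check every archive member, then re-parse each contained XML document. This sketch only confirms well-formedness; a full check would also re-run XSD validation against the DAC8 schema.

```python
import zipfile
import xml.etree.ElementTree as ET

def verify_archive(zip_path):
    """Confirm the compression round trip did not corrupt the data.

    Returns the root tag of each contained XML document, raising if any
    member fails its CRC check or cannot be parsed.
    """
    with zipfile.ZipFile(zip_path) as zf:
        bad = zf.testzip()           # first corrupt member, or None
        if bad is not None:
            raise ValueError(f"corrupt archive member: {bad}")
        return [ET.parse(zf.open(name)).getroot().tag for name in zf.namelist()]
```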

---

Proactive Measures

  • Estimate file sizes early. Before the reporting deadline, estimate the expected file size based on your user count and transaction volume. This gives you time to plan for splitting or compression.
  • Test with realistic data volumes. Include file size testing as part of your reporting preparation. Generate test files with realistic record counts and verify that your entire pipeline (generation, validation, submission) handles them correctly.
  • Maintain headroom. Aim to keep individual file sizes well below the maximum limit to allow for some variation in data volume.
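The estimation and headroom points reduce to simple arithmetic. The overhead and headroom figures below are placeholder assumptions; measure the real per-user average from a small generated sample file.

```python
import math

def estimate_report_size(user_count, avg_bytes_per_user, overhead_bytes=4096):
    """Rough size estimate: fixed header/footer overhead plus a per-user
    average measured from a generated sample."""
    return overhead_bytes + user_count * avg_bytes_per_user

def files_needed(estimated_bytes, portal_limit_bytes, headroom=0.8):
    """Plan the split so each file targets `headroom` of the portal limit,
    leaving room for variation in data volume."""
    return math.ceil(estimated_bytes / (portal_limit_bytes * headroom))
```

Running these before the deadline tells you whether splitting or compression will be needed at all, and roughly how many files to plan for.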

---

Disclaimer

This article provides general technical guidance. File size limits, compression requirements, and submission portal capabilities vary by jurisdiction. Always consult the relevant tax authority's technical documentation for specific requirements.

Need help with DAC8 reporting?

Our team handles XML generation, TIN validation, and submission for CASPs across all 27 EU Member States.

Get Expert Help