I'm trying to create a Function App that is triggered by Blob Storage events, and I would like to retrieve the file's metadata fields to use as parameters for the file-processing logic.

But when I declare the @BlobTrigger parameter as BlobClient blobClient, the invocation raises the exception below. If I change it to String content (or byte[]), the trigger works as expected.

I tried removing gson as an indirect dependency in pom.xml, and I also tried declaring Jackson as the default serializer, but with no success.

I managed to build the BlobClient by hand and reach the file metadata, but I don't think that's the right approach.

Am I missing something?

Thanks in advance

  • azure-functions-java-library : 3.1.0
  • azure-storage-blob : 12.28.0
  • Azure Functions Core Tools: Core Tools Version 4.0.7512, Commit hash N/A +8be8cc84f6ad64c784e083bf4da7fa381bdd3449 (64-bit), Function Runtime Version 4.1040.300.25317

Reference: Azure Blob storage trigger for Azure Functions

Code snippet:

    @FunctionName("processBlob")
    public void run(
            @BlobTrigger(name = "content", path = "%GATEWAY_CONTAINER_NAME%%GATEWAY_SOURCE_FOLDER%/{fileName}", connection = "SourceBlogStorageConnStr", dataType = "binary") BlobClient blobClient,
            @BindingName("fileName") String fileName,
            ExecutionContext ctx) {

        ctx.getLogger().info("Size = " + blobClient.getProperties().getBlobSize());
        ctx.getLogger().info("Filename = " + fileName);
        ctx.getLogger().info("Metadata = " + blobClient.getProperties().getMetadata().entrySet());
    }

Output:

Executing 'Functions.processBlob' (Reason='New blob detected(LogsAndContainerScan): my_blob_storage/my_folder/my_file.csv', Id=bb2845c7-9d8c-4ecd-b6e0-21b42b73a345)
[2025-07-09T20:58:04.183Z] Trigger Details: MessageId: 28459ab1-e4de-4c2d-9bfb-901df92c32de, DequeueCount: 1, InsertedOn: 2025-07-09T20:58:03.000+00:00, BlobCreated: 2025-07-09T17:55:50.000+00:00, BlobLastModified: 2025-07-09T20:58:02.000+00:00
[2025-07-09T20:58:04.209Z] Executed 'Functions.processBlob' (Failed, Id=bb2845c7-9d8c-4ecd-b6e0-21b42b73a345, Duration=30ms)
[2025-07-09T20:58:04.209Z] System.Private.CoreLib: Exception while executing function: Functions.processBlob. System.Private.CoreLib: Result: Failure
[2025-07-09T20:58:04.209Z] Exception: IllegalArgumentException: Class com.azure.storage.blob.specialized.BlockBlobClient declares multiple JSON fields named 'client'; conflict is caused by fields com.azure.storage.blob.specialized.BlockBlobClient#client and com.azure.storage.blob.specialized.BlobClientBase#client
[2025-07-09T20:58:04.209Z] See https://github.com/google/gson/blob/main/Troubleshooting.md#duplicate-fields
[2025-07-09T20:58:04.210Z] Stack: java.lang.IllegalArgumentException: Class com.azure.storage.blob.specialized.BlockBlobClient declares multiple JSON fields named 'client'; conflict is caused by fields com.azure.storage.blob.specialized.BlockBlobClient#client and com.azure.storage.blob.specialized.BlobClientBase#client
[2025-07-09T20:58:04.210Z] See https://github.com/google/gson/blob/main/Troubleshooting.md#duplicate-fields
[2025-07-09T20:58:04.210Z]      at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.createDuplicateFieldException(ReflectiveTypeAdapterFactory.java:313)
[2025-07-09T20:58:04.210Z]      at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.getBoundFields(ReflectiveTypeAdapterFactory.java:409)
[2025-07-09T20:58:04.210Z]      at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.create(ReflectiveTypeAdapterFactory.java:161)
[2025-07-09T20:58:04.210Z]      at com.google.gson.Gson.getAdapter(Gson.java:628)
[2025-07-09T20:58:04.210Z]      at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.createBoundField(ReflectiveTypeAdapterFactory.java:201)
[2025-07-09T20:58:04.210Z]      at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.getBoundFields(ReflectiveTypeAdapterFactory.java:395)
[2025-07-09T20:58:04.210Z]      at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory.create(ReflectiveTypeAdapterFactory.java:161)
[2025-07-09T20:58:04.210Z]      at com.google.gson.Gson.getAdapter(Gson.java:628)
[2025-07-09T20:58:04.211Z]      at com.google.gson.Gson.fromJson(Gson.java:1360)
[2025-07-09T20:58:04.211Z]      at com.google.gson.Gson.fromJson(Gson.java:1262)
[2025-07-09T20:58:04.211Z]      at com.google.gson.Gson.fromJson(Gson.java:1171)
[2025-07-09T20:58:04.211Z]      at com.google.gson.Gson.fromJson(Gson.java:1137)
[2025-07-09T20:58:04.211Z]      at com.microsoft.azure.functions.worker.binding.DataOperations.convertFromJson(DataOperations.java:158)
[2025-07-09T20:58:04.211Z]      at com.microsoft.azure.functions.worker.binding.DataOperations.apply(DataOperations.java:114)
[2025-07-09T20:58:04.211Z]      at com.microsoft.azure.functions.worker.binding.DataSource.computeByType(DataSource.java:56)
[2025-07-09T20:58:04.211Z]      at com.microsoft.azure.functions.worker.binding.RpcStringDataSource.computeByType(RpcStringDataSource.java:5)
[2025-07-09T20:58:04.211Z]      at com.microsoft.azure.functions.worker.binding.DataSource.computeByName(DataSource.java:42)
[2025-07-09T20:58:04.211Z]      at com.microsoft.azure.functions.worker.binding.RpcStringDataSource.computeByName(RpcStringDataSource.java:5)
[2025-07-09T20:58:04.211Z]      at com.microsoft.azure.functions.worker.binding.BindingDataStore.getDataByName(BindingDataStore.java:66)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.binding.ExecutionContextDataSource.getBindingData(ExecutionContextDataSource.java:181)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.broker.ParameterResolver.resolve(ParameterResolver.java:44)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.broker.ParameterResolver.resolveArguments(ParameterResolver.java:22)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.broker.EnhancedJavaMethodExecutorImpl.execute(EnhancedJavaMethodExecutorImpl.java:20)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.chain.FunctionExecutionMiddleware.invoke(FunctionExecutionMiddleware.java:19)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.chain.InvocationChain.doNext(InvocationChain.java:21)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.broker.JavaFunctionBroker.invokeMethod(JavaFunctionBroker.java:195)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:34)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.handler.InvocationRequestHandler.execute(InvocationRequestHandler.java:10)
[2025-07-09T20:58:04.212Z]      at com.microsoft.azure.functions.worker.handler.MessageHandler.handle(MessageHandler.java:44)
[2025-07-09T20:58:04.213Z]      at com.microsoft.azure.functions.worker.JavaWorkerClient$StreamingMessagePeer.lambda$onNext$0(JavaWorkerClient.java:94)
[2025-07-09T20:58:04.213Z]      at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
[2025-07-09T20:58:04.213Z]      at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[2025-07-09T20:58:04.213Z]      at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
[2025-07-09T20:58:04.213Z]      at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[2025-07-09T20:58:04.213Z]      at java.base/java.lang.Thread.run(Thread.java:840)

This works:

    @FunctionName("processBlob")
    public void run(
            @BlobTrigger(name = "content", path = "%GATEWAY_CONTAINER_NAME%%GATEWAY_SOURCE_FOLDER%/{fileName}", connection = "SourceBlogStorageConnStr", dataType = "string") String content,
                    @BindingName("fileName") String fileName,
            ExecutionContext ctx) {

        // Build the service client by hand from the connection string app setting
        BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
                .connectionString(System.getenv("GATEWAY_CONN_STRING"))
                .buildClient();

        BlobContainerClient containerClient = blobServiceClient
                .getBlobContainerClient(
                        System.getenv("GATEWAY_CONTAINER_NAME") + System.getenv("GATEWAY_SOURCE_FOLDER"));

        BlobClient blobClient = containerClient.getBlobClient(fileName);

        ctx.getLogger().info("Size = " + blobClient.getProperties().getBlobSize());
        ctx.getLogger().info("Filename = " + fileName);
        ctx.getLogger().info("Metadata = " + blobClient.getProperties().getMetadata().entrySet());
    }

[2025-07-09T21:37:02.435Z] Executing 'Functions.processBlob' (Reason='New blob detected(LogsAndContainerScan): my_blob_storage/my_folder/my_file.csv', Id=bb2845c7-9d8c-4ecd-b6e0-21b42b73a345)
[2025-07-09T21:37:02.435Z] Trigger Details: MessageId: 4726d7c3-558e-4d52-81f1-88fda86286c6, DequeueCount: 1, InsertedOn: 2025-07-09T21:37:02.000+00:00, BlobCreated: 2025-07-09T17:55:50.000+00:00, BlobLastModified: 2025-07-09T21:37:01.000+00:00
[2025-07-09T21:37:02.608Z] SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
[2025-07-09T21:37:02.609Z] SLF4J: Defaulting to no-operation (NOP) logger implementation
[2025-07-09T21:37:02.609Z] SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[2025-07-09T21:37:08.580Z] Size = 1236484
[2025-07-09T21:37:08.581Z] Filename = my_file.csv
[2025-07-09T21:37:08.581Z] Metadata = [Metadata3=value, Metadata4=value, Metadata5=value, Metadata1=value, Metadata2=value]
[2025-07-09T21:37:08.581Z] Function "processBlob" (Id: 1dc69e2a-682e-47ca-9805-2094f1ab1cf5) invoked by Java Worker
[2025-07-09T21:37:08.591Z] Executed 'Functions.processBlob' (Succeeded, Id=1dc69e2a-682e-47ca-9805-2094f1ab1cf5, Duration=6196ms)
  • This most likely occurs because BlobClient extends BlobClientBase, and both classes declare a client field. I doubt these classes were ever intended to serve as JSON DTOs (though I'm not sure about other formats that may allow non-unique field names). That said, I would strongly advise against (de)serializing objects whose structure you don't control. The moment their definition changes, your code is at risk. Commented Jul 10 at 13:49
  • If modifying the incoming JSON structure is not an option, create local mapping classes (e.g., BlobClientDto as the root class) that extract only the data you actually need. I'm not sure whether it's possible to reconstruct a full BlobClient instance from such a DTO (I'm not familiar with its API), but judging by your example, you likely don't need the entire object. In that case, your mapping can be very concise, containing just the essential fields (a rough sketch of this idea follows after the comments). Commented Jul 10 at 13:49
  • If you have control over both ends, serialization and deserialization, then replacing BlobClient entirely with your own shared DTOs, BlobClientDto, is ideal. While this might result in slightly more boilerplate, it significantly improves safety and maintainability. Commented Jul 10 at 13:49
  • BlobClient is a complex object from the Azure SDK. I don't have control over its definition; I'm only declaring the method as described in the Microsoft documentation. Commented Jul 10 at 17:02
  • What happens if you change the type of the content parameter to a custom type? Commented Jul 10 at 22:01
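
As a rough illustration of the mapping-class idea suggested in the comments above (a minimal sketch only: BlobInfoDto and its fields are hypothetical names, and this kind of mapping would only apply if the payload being deserialized were JSON, which is not the case for the CSV blob in the question):

    import com.google.gson.Gson;

    // Hypothetical DTO that keeps only the fields the function actually needs,
    // instead of deserializing into an SDK type whose structure we don't control.
    public class BlobInfoDto {
        public String fileName;
        public long size;

        public static BlobInfoDto fromJson(String json) {
            return new Gson().fromJson(json, BlobInfoDto.class);
        }
    }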

2 Answers


I managed to obtain the expected behavior; I was just missing the configuration required for SDK types, which is described on a different page, under Requirements (a sketch of both changes is below):

  • Set the JAVA_ENABLE_SDK_TYPES app setting to true to enable SDK types.

  • azure-functions-maven-plugin (or the Gradle plug-in) version 1.38.0 or higher.
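
For reference, here is roughly what those two changes look like. This is only a sketch: the setting name and the plug-in version come from the Requirements page mentioned above, and the remaining values are just placeholders. In local.settings.json (in Azure the same value goes in the Function App's application settings):

    {
      "IsEncrypted": false,
      "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "java",
        "JAVA_ENABLE_SDK_TYPES": "true"
      }
    }

And in pom.xml:

    <plugin>
        <groupId>com.microsoft.azure</groupId>
        <artifactId>azure-functions-maven-plugin</artifactId>
        <version>1.38.0</version>
        <!-- existing configuration left unchanged -->
    </plugin>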


2 Comments

Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center.
Wow! That's absolutely an unexpected solution! Congrats and glad to see you managed to resolve the issue in a concise way!

I've never used the Java SDK, but could this be the culprit?

dataType = "binary"

You have this in the trigger; could you try removing it? The samples in the docs don't have it.
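
For concreteness, that suggestion would make the trigger declaration look roughly like this (a sketch adapted from the snippet in the question, untested; only the dataType attribute is removed):

    @FunctionName("processBlob")
    public void run(
            @BlobTrigger(name = "content", path = "%GATEWAY_CONTAINER_NAME%%GATEWAY_SOURCE_FOLDER%/{fileName}", connection = "SourceBlogStorageConnStr") BlobClient blobClient,
            @BindingName("fileName") String fileName,
            ExecutionContext ctx) {

        ctx.getLogger().info("Size = " + blobClient.getProperties().getBlobSize());
    }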

1 Comment

Thanks for the reply! I had already tried that once before; I removed it again now, but it raised the same exception.
