I am working on an ADF pipeline to export a CSV file from a Snowflake table. The table contains some fields with special characters (e.g. French accented characters).
If I look at the CSV output in Notepad, it looks perfect.
However, when I open the same file in Excel, the special characters are not displayed correctly (e.g. "é" appears as "Ã©").
If I change the encoding in Notepad to UTF-8-BOM, Excel then opens it correctly. The characters also display as expected if I import the data into Excel using the "Get Data" feature.
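To make the behaviour concrete, here is a small Python sketch of what seems to be happening (the file names and sample text are just illustrations, not my actual data): the bytes ADF writes are valid UTF-8, and the only difference the UTF-8-BOM save makes is three extra bytes at the very start of the file.

```python
text = "Montréal, Québec"

# This mimics what ADF writes today: valid UTF-8, no byte-order mark (BOM).
with open("export_no_bom.csv", "w", encoding="utf-8", newline="") as f:
    f.write(text + "\n")

# With no BOM, Excel (on double-click) tends to assume the legacy ANSI code
# page (Windows-1252 here), which is what mangles the accents:
raw = open("export_no_bom.csv", "rb").read()
print(raw.decode("cp1252"))        # MontrÃ©al, QuÃ©bec  <- what Excel shows

# "UTF-8-BOM" is the same UTF-8 bytes with EF BB BF prepended at the start.
# Python's "utf-8-sig" codec writes that BOM automatically:
with open("export_with_bom.csv", "w", encoding="utf-8-sig", newline="") as f:
    f.write(text + "\n")

print(open("export_with_bom.csv", "rb").read()[:3])   # b'\xef\xbb\xbf'
```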
I do not see an option in ADF, under the dataset's encoding setting, to set it to UTF-8-BOM; the only choices are "UTF-8" and "UTF-8 without BOM".
My question is basically this: how can I export the data to a CSV file in such a way that the special characters will display correctly when a typical end user opens the file directly in Excel?
Below is an example of how the output file's dataset is currently configured.
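(This is a representative sketch of the sink dataset's JSON rather than my exact definition; the dataset, linked service, container, and file names are placeholders. The relevant setting is "encodingName".)

```json
{
  "name": "SnowflakeExportCsv",
  "properties": {
    "type": "DelimitedText",
    "linkedServiceName": {
      "referenceName": "AzureBlobStorage_LS",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "exports",
        "fileName": "snowflake_export.csv"
      },
      "columnDelimiter": ",",
      "quoteChar": "\"",
      "firstRowAsHeader": true,
      "encodingName": "UTF-8"
    }
  }
}
```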