I have to run GDAL commands from Python in subprocesses. The first command I tried is gdalsrsinfo.
self.proc = subprocess.Popen(['gdalsrsinfo', '-e', 'D:\tif\dop20c_33328_5650.tif'],
                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
while self.proc.poll() is None:
    output = self.proc.stdout.readline().strip()
    self.print_log.emit(output)
    if output.startswith(b'EPSG:'):
        epsg = output.rsplit(':', 1)[1]
        self.print_log.emit(output)
Since the while loop is entered, the process does start, but the output variable is always empty. When I execute the same command in my shell, I get the correct output. How can I get the same output in Python that I get in my shell? I also tried other GDAL commands, with the same result: I don't get any output.
I'm working on Windows 10 with Python 3.7.
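In case it helps with reproducing this, here is a stripped-down, standalone sketch of what I'm doing: print() stands in for my print_log signal, the raster path is just my test file (written as a raw string here), and the split uses the byte string b':' since readline() returns bytes.

import subprocess

# Same call as above, just outside my worker class; print() replaces print_log.emit().
tif_path = r'D:\tif\dop20c_33328_5650.tif'
proc = subprocess.Popen(
    ['gdalsrsinfo', '-e', tif_path],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    shell=True,
)

while proc.poll() is None:
    output = proc.stdout.readline().strip()
    print(output)  # in my real code this is always empty
    if output.startswith(b'EPSG:'):
        epsg = output.rsplit(b':', 1)[1]  # e.g. b'25833'
        print(epsg)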
By the way, the output of the command in my shell looks like this:
A:\>gdalsrsinfo -e D:\tif\dop20c_33328_5650.tif
EPSG:25833
PROJ.4 : +proj=utm +zone=33 +ellps=GRS80 +towgs84=0,0,0,0,0,0,0 +units=m +no_defs
OGC WKT2:2018 :
PROJCRS["ETRS89 / UTM zone 33N",
BASEGEOGCRS["ETRS89",
DATUM["European Terrestrial Reference System 1989",
ELLIPSOID["GRS 1980",6378137,298.257222101,
LENGTHUNIT["metre",1]]],
PRIMEM["Greenwich",0,
ANGLEUNIT["degree",0.0174532925199433]],
ID["EPSG",4258]],
CONVERSION["UTM zone 33N",
METHOD["Transverse Mercator",
ID["EPSG",9807]],
PARAMETER["Latitude of natural origin",0,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8801]],
PARAMETER["Longitude of natural origin",15,
ANGLEUNIT["degree",0.0174532925199433],
ID["EPSG",8802]],
PARAMETER["Scale factor at natural origin",0.9996,
SCALEUNIT["unity",1],
ID["EPSG",8805]],
PARAMETER["False easting",500000,
LENGTHUNIT["metre",1],
ID["EPSG",8806]],
PARAMETER["False northing",0,
LENGTHUNIT["metre",1],
ID["EPSG",8807]]],
CS[Cartesian,2],
AXIS["(E)",east,
ORDER[1],
LENGTHUNIT["metre",1]],
AXIS["(N)",north,
ORDER[2],
LENGTHUNIT["metre",1]],
USAGE[
SCOPE["unknown"],
AREA["Europe - 12°E to 18°E and ETRS89 by country"],
BBOX[46.4,12,84.01,18.01]],
ID["EPSG",25833]]