geopandas bug?

Peter Otten __peter__ at web.de
Sat Jan 14 12:30:23 EST 2017


Xristos Xristoou wrote:

> On Saturday, January 14, 2017 at 6:33:54 PM UTC+2, Peter Otten
> wrote:
>> Xristos Xristoou wrote:
>> 
>> >> I suggest that you file a bug report.
>> > 
>> > Mr. Peter Otten, did you see my shapefiles? They intersect 100%.
>> > I used intersection in QGIS and it works fine.
>> 
>> Yes, I downloaded the zipfile at
>> 
>> > https://www.dropbox.com/s/2693nfi248z0y9q/files.zip?dl=0
>> 
>> and when I ran your code I got the very error that you saw. There are
>> many NaN values in your data, so if it works elsewhere perhaps the data
>> is corrupted in some way. I'm sorry I cannot help you any further.
>> 
>> Good luck!
> 
> One more question: I have an idea of what is wrong, but if my code works,
> how do I export the spatial join "pointInPoly" to a new shapefile?

You can find an object's methods in the interactive interpreter with dir():

>>> dir(pointInPoly)
['T', '_AXIS_ALIASES', '_AXIS_IALIASES', '_AXIS_LEN', '_AXIS_NAMES', 
'_AXIS_NUMBERS', '_AXIS_ORDERS', '_AXIS_REVERSED', '_AXIS_SLICEMAP', 

<snip>

'rmod', 'rmul', 'rotate', 'rpow', 'rsub', 'rtruediv', 'save', 'scale', 
'select', 'set_geometry', 'set_index', 'set_value', 'shape', 'shift', 
'simplify', 'sindex', 'skew', 'sort', 'sort_index', 'sortlevel', 'squeeze', 
'stack', 'std', 'sub', 'subtract', 'sum', 'swapaxes', 'swaplevel', 
'symmetric_difference', 'tail', 'take', 'to_clipboard', 'to_crs', 'to_csv', 
'to_dense', 'to_dict', 'to_excel', 'to_file', 'to_gbq', 'to_hdf', 'to_html', 
'to_json', 'to_latex', 'to_msgpack', 'to_panel', 'to_period', 'to_pickle', 
'to_records', 'to_sparse', 'to_sql', 'to_stata', 'to_string', 
'to_timestamp', 'to_wide', 'total_bounds', 'touches', 'translate', 
'transpose', 'truediv', 'truncate', 'tshift', 'type', 'tz_convert', 
'tz_localize', 'unary_union', 'union', 'unstack', 'update', 'values', 'var', 
'where', 'within', 'xs']
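
That listing is long; one way to narrow it down is a list comprehension over 
dir(). A quick sketch of the trick, using str as a stand-in object (with the 
real GeoDataFrame you would filter for names starting with "to_"):

```python
# Filter dir() output for methods whose names share a prefix.
# str stands in here for the GeoDataFrame from the thread.
candidates = [name for name in dir(str) if name.startswith("is")]
print(candidates)
```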

OK, that's quite a lot, but to_file() seems to be a good candidate. Let's 
see what it does:

>>> help(pointInPoly.to_file)
Help on method to_file in module geopandas.geodataframe:

to_file(filename, driver='ESRI Shapefile', schema=None, **kwargs) method of 
geopandas.geodataframe.GeoDataFrame instance
    Write this GeoDataFrame to an OGR data source
    
    A dictionary of supported OGR providers is available via:
    >>> import fiona
    >>> fiona.supported_drivers
    
    Parameters
    ----------
    filename : string
        File path or file handle to write to.
    driver : string, default 'ESRI Shapefile'
        The OGR format driver used to write the vector file.
    schema : dict, default None
        If specified, the schema dictionary is passed to Fiona to
        better control how the file is written.
    
    The *kwargs* are passed to fiona.open and can be used to write
    to multi-layer data, store data within archives (zip files), etc.

Looks good, run it:

>>> pointInPoly.to_file("point_in_poly")

No error. Does it round-trip?

>>> pointInPoly == gpd.GeoDataFrame.from_file("point_in_poly")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3/dist-packages/pandas/core/ops.py", line 875, in f
    return self._compare_frame(other, func, str_rep)
  File "/usr/lib/python3/dist-packages/pandas/core/frame.py", line 2860, in 
_compare_frame
    raise ValueError('Can only compare identically-labeled '
ValueError: Can only compare identically-labeled DataFrame objects

Ouch, unfortunately not. Upon further inspection (binding the reloaded 
frame to pip):

>>> pip = gpd.GeoDataFrame.from_file("point_in_poly")
>>> pip.columns
Index(['geometry', 'index_righ', 'name_left', 'name_right'], dtype='object')
>>> pointInPoly.columns
Index(['geometry', 'name_left', 'index_right', 'name_right'], 
dtype='object')
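
The two Index objects differ in exactly one label; comparing them as plain 
Python sets pinpoints it (using the column names printed above):

```python
# Column labels before writing and after reading back, as shown above.
original = {"geometry", "name_left", "index_right", "name_right"}
reloaded = {"geometry", "index_righ", "name_left", "name_right"}
print(original - reloaded)  # the label that was lost
print(reloaded - original)  # the label that appeared instead
```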

Looks like column names are truncated to 10 characters, a limit of the DBF 
format that shapefiles use to store attribute data. Again I don't know how 
to overcome this, but as a special service here's the first hit for 'shp 
file column name limit' on a popular search engine:

http://gis.stackexchange.com/questions/15784/how-to-bypass-10-character-limit-of-field-name-in-shapefiles
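
The truncation is easy to reproduce: 'index_right' cut to ten characters is 
exactly 'index_righ'. A toy illustration (not the actual shapefile writer):

```python
def dbf_truncate(name, limit=10):
    """Mimic the DBF field-name length limit applied by shapefile writers."""
    return name[:limit]

print(dbf_truncate("index_right"))  # the column name seen after round-trip
```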

I'm out of this now.





More information about the Python-list mailing list