Updated bundled version of PyGeoIP

parent 487ee03e6c
commit b0e43c0aea
@@ -0,0 +1,165 @@
+GNU LESSER GENERAL PUBLIC LICENSE
+Version 3, 29 June 2007
+
+Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
+Everyone is permitted to copy and distribute verbatim copies
+of this license document, but changing it is not allowed.
+
+
+This version of the GNU Lesser General Public License incorporates
+the terms and conditions of version 3 of the GNU General Public
+License, supplemented by the additional permissions listed below.
+
+0. Additional Definitions.
+
+As used herein, "this License" refers to version 3 of the GNU Lesser
+General Public License, and the "GNU GPL" refers to version 3 of the GNU
+General Public License.
+
+"The Library" refers to a covered work governed by this License,
+other than an Application or a Combined Work as defined below.
+
+An "Application" is any work that makes use of an interface provided
+by the Library, but which is not otherwise based on the Library.
+Defining a subclass of a class defined by the Library is deemed a mode
+of using an interface provided by the Library.
+
+A "Combined Work" is a work produced by combining or linking an
+Application with the Library. The particular version of the Library
+with which the Combined Work was made is also called the "Linked
+Version".
+
+The "Minimal Corresponding Source" for a Combined Work means the
+Corresponding Source for the Combined Work, excluding any source code
+for portions of the Combined Work that, considered in isolation, are
+based on the Application, and not on the Linked Version.
+
+The "Corresponding Application Code" for a Combined Work means the
+object code and/or source code for the Application, including any data
+and utility programs needed for reproducing the Combined Work from the
+Application, but excluding the System Libraries of the Combined Work.
+
+1. Exception to Section 3 of the GNU GPL.
+
+You may convey a covered work under sections 3 and 4 of this License
+without being bound by section 3 of the GNU GPL.
+
+2. Conveying Modified Versions.
+
+If you modify a copy of the Library, and, in your modifications, a
+facility refers to a function or data to be supplied by an Application
+that uses the facility (other than as an argument passed when the
+facility is invoked), then you may convey a copy of the modified
+version:
+
+a) under this License, provided that you make a good faith effort to
+ensure that, in the event an Application does not supply the
+function or data, the facility still operates, and performs
+whatever part of its purpose remains meaningful, or
+
+b) under the GNU GPL, with none of the additional permissions of
+this License applicable to that copy.
+
+3. Object Code Incorporating Material from Library Header Files.
+
+The object code form of an Application may incorporate material from
+a header file that is part of the Library. You may convey such object
+code under terms of your choice, provided that, if the incorporated
+material is not limited to numerical parameters, data structure
+layouts and accessors, or small macros, inline functions and templates
+(ten or fewer lines in length), you do both of the following:
+
+a) Give prominent notice with each copy of the object code that the
+Library is used in it and that the Library and its use are
+covered by this License.
+
+b) Accompany the object code with a copy of the GNU GPL and this license
+document.
+
+4. Combined Works.
+
+You may convey a Combined Work under terms of your choice that,
+taken together, effectively do not restrict modification of the
+portions of the Library contained in the Combined Work and reverse
+engineering for debugging such modifications, if you also do each of
+the following:
+
+a) Give prominent notice with each copy of the Combined Work that
+the Library is used in it and that the Library and its use are
+covered by this License.
+
+b) Accompany the Combined Work with a copy of the GNU GPL and this license
+document.
+
+c) For a Combined Work that displays copyright notices during
+execution, include the copyright notice for the Library among
+these notices, as well as a reference directing the user to the
+copies of the GNU GPL and this license document.
+
+d) Do one of the following:
+
+0) Convey the Minimal Corresponding Source under the terms of this
+License, and the Corresponding Application Code in a form
+suitable for, and under terms that permit, the user to
+recombine or relink the Application with a modified version of
+the Linked Version to produce a modified Combined Work, in the
+manner specified by section 6 of the GNU GPL for conveying
+Corresponding Source.
+
+1) Use a suitable shared library mechanism for linking with the
+Library. A suitable mechanism is one that (a) uses at run time
+a copy of the Library already present on the user's computer
+system, and (b) will operate properly with a modified version
+of the Library that is interface-compatible with the Linked
+Version.
+
+e) Provide Installation Information, but only if you would otherwise
+be required to provide such information under section 6 of the
+GNU GPL, and only to the extent that such information is
+necessary to install and execute a modified version of the
+Combined Work produced by recombining or relinking the
+Application with a modified version of the Linked Version. (If
+you use option 4d0, the Installation Information must accompany
+the Minimal Corresponding Source and Corresponding Application
+Code. If you use option 4d1, you must provide the Installation
+Information in the manner specified by section 6 of the GNU GPL
+for conveying Corresponding Source.)
+
+5. Combined Libraries.
+
+You may place library facilities that are a work based on the
+Library side by side in a single library together with other library
+facilities that are not Applications and are not covered by this
+License, and convey such a combined library under terms of your
+choice, if you do both of the following:
+
+a) Accompany the combined library with a copy of the same work based
+on the Library, uncombined with any other library facilities,
+conveyed under the terms of this License.
+
+b) Give prominent notice with the combined library that part of it
+is a work based on the Library, and explaining where to find the
+accompanying uncombined form of the same work.
+
+6. Revised Versions of the GNU Lesser General Public License.
+
+The Free Software Foundation may publish revised and/or new versions
+of the GNU Lesser General Public License from time to time. Such new
+versions will be similar in spirit to the present version, but may
+differ in detail to address new problems or concerns.
+
+Each version is given a distinguishing version number. If the
+Library as you received it specifies that a certain numbered version
+of the GNU Lesser General Public License "or any later version"
+applies to it, you have the option of following the terms and
+conditions either of that published version or of any later version
+published by the Free Software Foundation. If the Library as you
+received it does not specify a version number of the GNU Lesser
+General Public License, you may choose any version of the GNU Lesser
+General Public License ever published by the Free Software Foundation.
+
+If the Library as you received it specifies that a proxy can decide
+whether future versions of the GNU Lesser General Public License shall
+apply, that proxy's public statement of acceptance of any version is
+permanent authorization for you to choose that version for the
+Library.
@@ -0,0 +1,21 @@
+Bootstrap manual for developers of pygeoip
+
+Dependencies: tox, nose, epydoc
+
+For testing we are using tox for virtualenv-based testing across
+Python versions and nose as the test framework.
+
+Tox will create virtualenvs for all Python versions pygeoip supports
+and install the current working tree using the setup.py install script.
+Running the tests requires a couple of sample databases found on the
+link below.
+
+Maxmind sample databases for testing can be downloaded here:
+http://www.defunct.cc/maxmind-geoip-samples.tar.gz (58 MB)
+
+Extract the tarball in the tests directory and run tox from the root directory.
+
+Please make sure your code passes all tests before opening pull requests.
+
+All the best,
+William Tisäter
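The tox flow described above (one virtualenv per supported Python version, the working tree installed via setup.py, tests run by nose against the extracted sample databases) would typically be driven by a tox.ini along these lines. This is an illustrative sketch, not the project's actual configuration; the environment list and the nosetests invocation are assumptions:

```ini
; Hypothetical tox.ini sketch for the workflow described above.
; The envlist and test command are assumptions, not the project's
; actual configuration.
[tox]
envlist = py26, py27, py32

[testenv]
deps = nose
; tox installs the working tree into each virtualenv via setup.py,
; then runs the test suite (which expects the MaxMind sample
; databases extracted into the tests directory)
commands = nosetests
```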
@@ -1,17 +1,13 @@
+# -*- coding: utf-8 -*-
 """
-Pure Python GeoIP API. The API is based off of U{MaxMind's C-based Python API<http://www.maxmind.com/app/python>},
-but the code itself is based on the U{pure PHP5 API<http://pear.php.net/package/Net_GeoIP/>}
-by Jim Winstead and Hans Lellelid.
+Pure Python GeoIP API
 
-It is mostly a drop-in replacement, except the
-C{new} and C{open} methods are gone. You should instantiate the L{GeoIP} class yourself:
+The API is based on MaxMind's C-based Python API, but the code itself is
+ported from the Pure PHP GeoIP API by Jim Winstead and Hans Lellelid.
 
-C{gi = GeoIP('/path/to/GeoIP.dat', pygeoip.MEMORY_CACHE)}
+@author: Jennifer Ennis <zaylea@gmail.com>
 
-@author: Jennifer Ennis <zaylea at gmail dot com>
-
-@license:
-Copyright(C) 2004 MaxMind LLC
+@license: Copyright(C) 2004 MaxMind LLC
 
 This program is free software: you can redistribute it and/or modify
 it under the terms of the GNU Lesser General Public License as published by
@@ -27,39 +23,43 @@ You should have received a copy of the GNU Lesser General Public License
 along with this program. If not, see <http://www.gnu.org/licenses/lgpl.txt>.
 """
 
-from __future__ import with_statement, absolute_import, division
 import os
 import math
 import socket
 import mmap
-import gzip
 import codecs
-from StringIO import StringIO
+from threading import Lock
 
-from . import const
-from .util import ip2long
-from .timezone import time_zone_by_country_and_region
+try:
+    from StringIO import StringIO
+except ImportError:
+    from io import StringIO, BytesIO
 
-import six
+from pygeoip import util, const
+from pygeoip.const import PY2, PY3
+from pygeoip.timezone import time_zone_by_country_and_region
 
+
+STANDARD = const.STANDARD
 MMAP_CACHE = const.MMAP_CACHE
 MEMORY_CACHE = const.MEMORY_CACHE
-STANDARD = const.STANDARD
+ENCODING = const.ENCODING
 
 
 class GeoIPError(Exception):
     pass
 
-class GeoIPMetaclass(type):
 
+class GeoIPMetaclass(type):
     def __new__(cls, *args, **kwargs):
         """
         Singleton method to gets an instance without reparsing the db. Unique
         instances are instantiated based on the filename of the db. Flags are
-        ignored for this, i.e. if you initialize one with STANDARD flag (default)
-        and then try later to initialize with MEMORY_CACHE, it will still
-        return the STANDARD one.
+        ignored for this, i.e. if you initialize one with STANDARD
+        flag (default) and then try later to initialize with MEMORY_CACHE, it
+        will still return the STANDARD one.
         """
 
         if not hasattr(cls, '_instances'):
             cls._instances = {}
+
@@ -68,25 +68,25 @@ class GeoIPMetaclass(type):
         elif 'filename' in kwargs:
             filename = kwargs['filename']
 
-        if not filename in cls._instances:
+        if filename not in cls._instances:
             cls._instances[filename] = type.__new__(cls, *args, **kwargs)
 
         return cls._instances[filename]
 
 
 GeoIPBase = GeoIPMetaclass('GeoIPBase', (object,), {})
 
-class GeoIP(GeoIPBase):
+
+class GeoIP(GeoIPBase):
     def __init__(self, filename, flags=0):
         """
         Initialize the class.
 
-        @param filename: path to a geoip database. If MEMORY_CACHE is used,
-            the file can be gzipped.
+        @param filename: Path to a geoip database.
         @type filename: str
-        @param flags: flags that affect how the database is processed.
-            Currently the only supported flags are STANDARD (the default),
-            MEMORY_CACHE (preload the whole file into memory), and
+        @param flags: Flags that affect how the database is processed.
+            Currently supported flags are STANDARD (the default),
+            MEMORY_CACHE (preload the whole file into memory) and
             MMAP_CACHE (access the file via mmap).
         @type flags: int
         """
@@ -94,42 +94,71 @@ class GeoIP(GeoIPBase):
         self._flags = flags
 
         if self._flags & const.MMAP_CACHE:
-            with open(filename, 'rb') as f:
-                self._filehandle = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
+            f = open(filename, 'rb')
+            access = mmap.ACCESS_READ
+            self._filehandle = mmap.mmap(f.fileno(), 0, access=access)
+            f.close()
 
         elif self._flags & const.MEMORY_CACHE:
-            if filename.endswith('.gz'):
-                opener = gzip.open
-            else:
-                opener = open
-            with opener(filename, 'rb') as f:
-                self._memoryBuffer = f.read()
-                self._filehandle = StringIO(self._memoryBuffer)
+            f = open(filename, 'rb')
+            self._memoryBuffer = f.read()
+            iohandle = BytesIO if PY3 else StringIO
+            self._filehandle = iohandle(self._memoryBuffer)
+            f.close()
 
         else:
-            self._filehandle = codecs.open(filename, 'rb','latin_1')
+            self._filehandle = codecs.open(filename, 'rb', ENCODING)
 
+        self._lock = Lock()
         self._setup_segments()
 
     def _setup_segments(self):
         """
-        Parses the database file to determine what kind of database is being used and setup
-        segment sizes and start points that will be used by the seek*() methods later.
+        Parses the database file to determine what kind of database is
+        being used and setup segment sizes and start points that will
+        be used by the seek*() methods later.
+
+        Supported databases:
+
+        * COUNTRY_EDITION
+        * COUNTRY_EDITION_V6
+        * REGION_EDITION_REV0
+        * REGION_EDITION_REV1
+        * CITY_EDITION_REV0
+        * CITY_EDITION_REV1
+        * CITY_EDITION_REV1_V6
+        * ORG_EDITION
+        * ISP_EDITION
+        * ASNUM_EDITION
+        * ASNUM_EDITION_V6
+
         """
         self._databaseType = const.COUNTRY_EDITION
         self._recordLength = const.STANDARD_RECORD_LENGTH
+        self._databaseSegments = const.COUNTRY_BEGIN
 
+        self._lock.acquire()
         filepos = self._filehandle.tell()
         self._filehandle.seek(-3, os.SEEK_END)
 
         for i in range(const.STRUCTURE_INFO_MAX_SIZE):
+            chars = chr(255) * 3
             delim = self._filehandle.read(3)
 
-            if delim == six.u(chr(255) * 3):
-                self._databaseType = ord(self._filehandle.read(1))
+            if PY3 and type(delim) is bytes:
+                delim = delim.decode(ENCODING)
 
+            if PY2:
+                chars = chars.decode(ENCODING)
+                if type(delim) is str:
+                    delim = delim.decode(ENCODING)
+
+            if delim == chars:
+                byte = self._filehandle.read(1)
+                self._databaseType = ord(byte)
+
+                # Compatibility with databases from April 2003 and earlier
                 if (self._databaseType >= 106):
-                    # backwards compatibility with databases from April 2003 and earlier
                     self._databaseType -= 105
 
                 if self._databaseType == const.REGION_EDITION_REV0:
@@ -140,51 +169,29 @@ class GeoIP(GeoIPBase):
 
         elif self._databaseType in (const.CITY_EDITION_REV0,
                                     const.CITY_EDITION_REV1,
+                                    const.CITY_EDITION_REV1_V6,
                                     const.ORG_EDITION,
                                     const.ISP_EDITION,
-                                    const.ASNUM_EDITION):
+                                    const.ASNUM_EDITION,
+                                    const.ASNUM_EDITION_V6):
             self._databaseSegments = 0
             buf = self._filehandle.read(const.SEGMENT_RECORD_LENGTH)
 
+            if PY3 and type(buf) is bytes:
+                buf = buf.decode(ENCODING)
+
             for j in range(const.SEGMENT_RECORD_LENGTH):
                 self._databaseSegments += (ord(buf[j]) << (j * 8))
 
-            if self._databaseType in (const.ORG_EDITION, const.ISP_EDITION):
+            LONG_RECORDS = (const.ORG_EDITION, const.ISP_EDITION)
+            if self._databaseType in LONG_RECORDS:
                 self._recordLength = const.ORG_RECORD_LENGTH
-
             break
         else:
             self._filehandle.seek(-4, os.SEEK_CUR)
 
-        if self._databaseType == const.COUNTRY_EDITION:
-            self._databaseSegments = const.COUNTRY_BEGIN
-
         self._filehandle.seek(filepos, os.SEEK_SET)
+        self._lock.release()
 
-    def _lookup_country_id(self, addr):
-        """
-        Get the country index.
-
-        This method is called by the _lookupCountryCode and _lookupCountryName
-        methods. It looks up the index ('id') for the country which is the key
-        for the code and name.
-
-        @param addr: The IP address
-        @type addr: str
-        @return: network byte order 32-bit integer
-        @rtype: int
-        """
-
-        ipnum = ip2long(addr)
-
-        if not ipnum:
-            raise ValueError("Invalid IP address: %s" % addr)
-
-        if self._databaseType != const.COUNTRY_EDITION:
-            raise GeoIPError('Invalid database type; country_* methods expect '\
-                             'Country database')
-
-        return self._seek_country(ipnum) - const.COUNTRY_BEGIN
-
     def _seek_country(self, ipnum):
         """
@@ -196,117 +203,119 @@ class GeoIP(GeoIPBase):
         @return: offset of start of record
         @rtype: int
         """
-        offset = 0
+        try:
+            offset = 0
+            seek_depth = 127 if len(str(ipnum)) > 10 else 31
 
-        for depth in range(31, -1, -1):
-            if self._flags & const.MEMORY_CACHE:
-                startIndex = 2 * self._recordLength * offset
-                length = 2 * self._recordLength
-                endIndex = startIndex + length
-                buf = self._memoryBuffer[startIndex:endIndex]
-            else:
-                self._filehandle.seek(2 * self._recordLength * offset, os.SEEK_SET)
-                buf = self._filehandle.read(2 * self._recordLength)
+            for depth in range(seek_depth, -1, -1):
+                if self._flags & const.MEMORY_CACHE:
+                    startIndex = 2 * self._recordLength * offset
+                    endIndex = startIndex + (2 * self._recordLength)
+                    buf = self._memoryBuffer[startIndex:endIndex]
+                else:
+                    startIndex = 2 * self._recordLength * offset
+                    readLength = 2 * self._recordLength
+                    self._lock.acquire()
+                    self._filehandle.seek(startIndex, os.SEEK_SET)
+                    buf = self._filehandle.read(readLength)
+                    self._lock.release()
 
-            x = [0,0]
+                if PY3 and type(buf) is bytes:
+                    buf = buf.decode(ENCODING)
 
-            for i in range(2):
-                for j in range(self._recordLength):
-                    x[i] += ord(buf[self._recordLength * i + j]) << (j * 8)
+                x = [0, 0]
+                for i in range(2):
+                    for j in range(self._recordLength):
+                        byte = buf[self._recordLength * i + j]
+                        x[i] += ord(byte) << (j * 8)
+                if ipnum & (1 << depth):
+                    if x[1] >= self._databaseSegments:
+                        return x[1]
+                    offset = x[1]
+                else:
+                    if x[0] >= self._databaseSegments:
+                        return x[0]
+                    offset = x[0]
+        except:
+            pass
 
-            if ipnum & (1 << depth):
-
-                if x[1] >= self._databaseSegments:
-                    return x[1]
-
-                offset = x[1]
-
-            else:
-
-                if x[0] >= self._databaseSegments:
-                    return x[0]
-
-                offset = x[0]
-
-
-        raise Exception('Error traversing database - perhaps it is corrupt?')
+        raise GeoIPError('Corrupt database')
 
     def _get_org(self, ipnum):
         """
-        Seek and return organization (or ISP) name for converted IP addr.
+        Seek and return organization or ISP name for ipnum.
         @param ipnum: Converted IP address
         @type ipnum: int
         @return: org/isp name
         @rtype: str
         """
 
         seek_org = self._seek_country(ipnum)
         if seek_org == self._databaseSegments:
             return None
 
-        record_pointer = seek_org + (2 * self._recordLength - 1) * self._databaseSegments
+        read_length = (2 * self._recordLength - 1) * self._databaseSegments
+        self._lock.acquire()
+        self._filehandle.seek(seek_org + read_length, os.SEEK_SET)
+        buf = self._filehandle.read(const.MAX_ORG_RECORD_LENGTH)
+        self._lock.release()
 
-        self._filehandle.seek(record_pointer, os.SEEK_SET)
+        if PY3 and type(buf) is bytes:
+            buf = buf.decode(ENCODING)
 
-        org_buf = self._filehandle.read(const.MAX_ORG_RECORD_LENGTH)
+        return buf[:buf.index(chr(0))]
 
-        return org_buf[:org_buf.index(chr(0))]
-
     def _get_region(self, ipnum):
         """
-        Seek and return the region info (dict containing country_code and region_name).
+        Seek and return the region info (dict containing country_code
+        and region_name).
 
-        @param ipnum: converted IP address
+        @param ipnum: Converted IP address
         @type ipnum: int
         @return: dict containing country_code and region_name
         @rtype: dict
         """
-        country_code = ''
         region = ''
+        country_code = ''
+        seek_country = self._seek_country(ipnum)
+
+        def get_region_name(offset):
+            region1 = chr(offset // 26 + 65)
+            region2 = chr(offset % 26 + 65)
+            return ''.join([region1, region2])
 
         if self._databaseType == const.REGION_EDITION_REV0:
-            seek_country = self._seek_country(ipnum)
             seek_region = seek_country - const.STATE_BEGIN_REV0
             if seek_region >= 1000:
                 country_code = 'US'
-                region = ''.join([chr((seek_region // 1000) // 26 + 65), chr((seek_region // 1000) % 26 + 65)])
+                region = get_region_name(seek_region - 1000)
             else:
                 country_code = const.COUNTRY_CODES[seek_region]
-                region = ''
         elif self._databaseType == const.REGION_EDITION_REV1:
-            seek_country = self._seek_country(ipnum)
             seek_region = seek_country - const.STATE_BEGIN_REV1
             if seek_region < const.US_OFFSET:
-                country_code = '';
-                region = ''
+                pass
             elif seek_region < const.CANADA_OFFSET:
                 country_code = 'US'
-                region = ''.join([chr((seek_region - const.US_OFFSET) // 26 + 65), chr((seek_region - const.US_OFFSET) % 26 + 65)])
+                region = get_region_name(seek_region - const.US_OFFSET)
             elif seek_region < const.WORLD_OFFSET:
                 country_code = 'CA'
-                region = ''.join([chr((seek_region - const.CANADA_OFFSET) // 26 + 65), chr((seek_region - const.CANADA_OFFSET) % 26 + 65)])
+                region = get_region_name(seek_region - const.CANADA_OFFSET)
             else:
-                i = (seek_region - const.WORLD_OFFSET) // const.FIPS_RANGE
-                if i < len(const.COUNTRY_CODES):
-                    #country_code = const.COUNTRY_CODES[(seek_region - const.WORLD_OFFSET) // const.FIPS_RANGE]
-                    country_code = const.COUNTRY_CODES[i]
-                else:
-                    country_code = ''
-                region = ''
-
-        elif self._databaseType in (const.CITY_EDITION_REV0, const.CITY_EDITION_REV1):
+                index = (seek_region - const.WORLD_OFFSET) // const.FIPS_RANGE
+                if index in const.COUNTRY_CODES:
+                    country_code = const.COUNTRY_CODES[index]
+        elif self._databaseType in const.CITY_EDITIONS:
             rec = self._get_record(ipnum)
-            country_code = rec['country_code'] if 'country_code' in rec else ''
-            region = rec['region_name'] if 'region_name' in rec else ''
+            region = rec.get('region_name', '')
+            country_code = rec.get('country_code', '')
 
-        return {'country_code' : country_code, 'region_name' : region }
+        return {'country_code': country_code, 'region_name': region}
 
     def _get_record(self, ipnum):
         """
         Populate location dict for converted IP.
 
-        @param ipnum: converted IP address
+        @param ipnum: Converted IP address
         @type ipnum: int
         @return: dict with country_code, country_code3, country_name,
             region, city, postal_code, latitude, longitude,
@@ -315,107 +324,115 @@ class GeoIP(GeoIPBase):
         """
         seek_country = self._seek_country(ipnum)
         if seek_country == self._databaseSegments:
-            return None
+            return {}
 
-        record_pointer = seek_country + (2 * self._recordLength - 1) * self._databaseSegments
+        read_length = (2 * self._recordLength - 1) * self._databaseSegments
+        self._lock.acquire()
+        self._filehandle.seek(seek_country + read_length, os.SEEK_SET)
+        buf = self._filehandle.read(const.FULL_RECORD_LENGTH)
+        self._lock.release()
 
-        self._filehandle.seek(record_pointer, os.SEEK_SET)
-        record_buf = self._filehandle.read(const.FULL_RECORD_LENGTH)
+        if PY3 and type(buf) is bytes:
+            buf = buf.decode(ENCODING)
 
-        record = {}
+        record = {
+            'dma_code': 0,
+            'area_code': 0,
+            'metro_code': '',
+            'postal_code': ''
+        }
 
-        record_buf_pos = 0
-        char = ord(record_buf[record_buf_pos])
-        #char = record_buf[record_buf_pos] if six.PY3 else ord(record_buf[record_buf_pos])
-        record['country_code'] = const.COUNTRY_CODES[char]
-        record['country_code3'] = const.COUNTRY_CODES3[char]
-        record['country_name'] = const.COUNTRY_NAMES[char]
-        record_buf_pos += 1
-        str_length = 0
-
-        # get region
-        char = ord(record_buf[record_buf_pos+str_length])
-        while (char != 0):
-            str_length += 1
-            char = ord(record_buf[record_buf_pos+str_length])
-
-        if str_length > 0:
-            record['region_name'] = record_buf[record_buf_pos:record_buf_pos+str_length]
-
-        record_buf_pos += str_length + 1
-        str_length = 0
-
-        # get city
-        char = ord(record_buf[record_buf_pos+str_length])
-        while (char != 0):
-            str_length += 1
-            char = ord(record_buf[record_buf_pos+str_length])
-
-        if str_length > 0:
-            record['city'] = record_buf[record_buf_pos:record_buf_pos+str_length]
-        else:
-            record['city'] = ''
-
-        record_buf_pos += str_length + 1
-        str_length = 0
-
-        # get the postal code
-        char = ord(record_buf[record_buf_pos+str_length])
-        while (char != 0):
-            str_length += 1
-            char = ord(record_buf[record_buf_pos+str_length])
-
-        if str_length > 0:
-            record['postal_code'] = record_buf[record_buf_pos:record_buf_pos+str_length]
-        else:
-            record['postal_code'] = None
-
-        record_buf_pos += str_length + 1
-        str_length = 0
-
         latitude = 0
         longitude = 0
+        buf_pos = 0
+
+        # Get country
+        char = ord(buf[buf_pos])
+        record['country_code'] = const.COUNTRY_CODES[char]
+        record['country_code3'] = const.COUNTRY_CODES3[char]
+        record['country_name'] = const.COUNTRY_NAMES[char]
+        record['continent'] = const.CONTINENT_NAMES[char]
+
+        buf_pos += 1
+        def get_data(buf, buf_pos):
+            offset = buf_pos
+            char = ord(buf[offset])
+            while (char != 0):
+                offset += 1
+                char = ord(buf[offset])
+            if offset > buf_pos:
+                return (offset, buf[buf_pos:offset])
+            return (offset, '')
+
+        offset, record['region_name'] = get_data(buf, buf_pos)
+        offset, record['city'] = get_data(buf, offset + 1)
+        offset, record['postal_code'] = get_data(buf, offset + 1)
|
||||||
|
buf_pos = offset + 1
|
||||||
|
|
||||||
for j in range(3):
|
for j in range(3):
|
||||||
char = ord(record_buf[record_buf_pos])
|
char = ord(buf[buf_pos])
|
||||||
record_buf_pos += 1
|
buf_pos += 1
|
||||||
latitude += (char << (j * 8))
|
latitude += (char << (j * 8))
|
||||||
|
|
||||||
record['latitude'] = (latitude/10000.0) - 180.0
|
|
||||||
|
|
||||||
for j in range(3):
|
for j in range(3):
|
||||||
char = ord(record_buf[record_buf_pos])
|
char = ord(buf[buf_pos])
|
||||||
record_buf_pos += 1
|
buf_pos += 1
|
||||||
longitude += (char << (j * 8))
|
longitude += (char << (j * 8))
|
||||||
|
|
||||||
record['longitude'] = (longitude/10000.0) - 180.0
|
record['latitude'] = (latitude / 10000.0) - 180.0
|
||||||
|
record['longitude'] = (longitude / 10000.0) - 180.0
|
||||||
|
|
||||||
if self._databaseType == const.CITY_EDITION_REV1:
|
if self._databaseType in (const.CITY_EDITION_REV1, const.CITY_EDITION_REV1_V6):
|
||||||
dmaarea_combo = 0
|
dmaarea_combo = 0
|
||||||
if record['country_code'] == 'US':
|
if record['country_code'] == 'US':
|
||||||
for j in range(3):
|
for j in range(3):
|
||||||
char = ord(record_buf[record_buf_pos])
|
char = ord(buf[buf_pos])
|
||||||
record_buf_pos += 1
|
dmaarea_combo += (char << (j * 8))
|
||||||
dmaarea_combo += (char << (j*8))
|
buf_pos += 1
|
||||||
|
|
||||||
record['dma_code'] = int(math.floor(dmaarea_combo/1000))
|
record['dma_code'] = int(math.floor(dmaarea_combo / 1000))
|
||||||
record['area_code'] = dmaarea_combo%1000
|
record['area_code'] = dmaarea_combo % 1000
|
||||||
else:
|
|
||||||
record['dma_code'] = 0
|
|
||||||
record['area_code'] = 0
|
|
||||||
|
|
||||||
if 'dma_code' in record and record['dma_code'] in const.DMA_MAP:
|
record['metro_code'] = const.DMA_MAP.get(record['dma_code'])
|
||||||
record['metro_code'] = const.DMA_MAP[record['dma_code']]
|
params = (record['country_code'], record['region_name'])
|
||||||
else:
|
record['time_zone'] = time_zone_by_country_and_region(*params)
|
||||||
record['metro_code'] = ''
|
|
||||||
|
|
||||||
if 'country_code' in record:
|
|
||||||
record['time_zone'] = time_zone_by_country_and_region(
|
|
||||||
record['country_code'], record.get('region_name')) or ''
|
|
||||||
else:
|
|
||||||
record['time_zone'] = ''
|
|
||||||
|
|
||||||
return record
|
return record
|
||||||
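The byte-level scheme `_get_record` decodes can be illustrated outside the library: coordinates are stored as three little-endian bytes holding `(degrees + 180) * 10000`, and the US DMA/area pair is packed as `dma_code * 1000 + area_code` in another 3-byte field. A minimal standalone sketch of that arithmetic (not the pygeoip API itself):

```python
def decode_coordinate(buf, pos):
    """Read 3 little-endian bytes and map the fixed-point value back to degrees."""
    raw = 0
    for j in range(3):
        raw += buf[pos + j] << (j * 8)
    return (raw / 10000.0) - 180.0

def split_dma_combo(combo):
    """dma_code lives in the thousands, area_code in the remainder."""
    return combo // 1000, combo % 1000

# 37.7510 degrees is stored as (37.7510 + 180) * 10000 = 2177510
raw = 2177510
buf = bytes([raw & 0xFF, (raw >> 8) & 0xFF, (raw >> 16) & 0xFF])
print(round(decode_coordinate(buf, 0), 4))  # -> 37.751
print(split_dma_combo(807415))              # -> (807, 415)
```

The example values are illustrative, not taken from a real database file.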
+
+    def _gethostbyname(self, hostname):
+        if self._databaseType in const.IPV6_EDITIONS:
+            try:
+                response = socket.getaddrinfo(hostname, 0, socket.AF_INET6)
+                family, socktype, proto, canonname, sockaddr = response[0]
+                address, port, flow, scope = sockaddr
+                return address
+            except socket.gaierror:
+                return ''
+        else:
+            return socket.gethostbyname(hostname)
+
+    def id_by_addr(self, addr):
+        """
+        Get the country index.
+        Looks up the index for the country which is the key for
+        the code and name.
+
+        @param addr: The IP address
+        @type addr: str
+        @return: network byte order 32-bit integer
+        @rtype: int
+        """
+        ipnum = util.ip2long(addr)
+        if not ipnum:
+            raise ValueError("Invalid IP address: %s" % addr)
+
+        COUNTY_EDITIONS = (const.COUNTRY_EDITION, const.COUNTRY_EDITION_V6)
+        if self._databaseType not in COUNTY_EDITIONS:
+            message = 'Invalid database type, expected Country'
+            raise GeoIPError(message)
+
+        return self._seek_country(ipnum) - const.COUNTRY_BEGIN
+
     def country_code_by_addr(self, addr):
         """
         Returns 2-letter country code (e.g. 'US') for specified IP address.
@@ -427,31 +444,38 @@ class GeoIP(GeoIPBase):
         @rtype: str
         """
         try:
-            if self._databaseType == const.COUNTRY_EDITION:
-                country_id = self._lookup_country_id(addr)
-                return const.COUNTRY_CODES[country_id]
-            elif self._databaseType in (const.REGION_EDITION_REV0, const.REGION_EDITION_REV1,
-                                        const.CITY_EDITION_REV0, const.CITY_EDITION_REV1):
-                return self.region_by_addr(addr)['country_code']
-            else:
-                raise GeoIPError('Invalid database type; country_* methods expect '\
-                                 'Country, City, or Region database')
+            VALID_EDITIONS = (const.COUNTRY_EDITION, const.COUNTRY_EDITION_V6)
+            if self._databaseType in VALID_EDITIONS:
+                ipv = 6 if addr.find(':') >= 0 else 4
+                if ipv == 4 and self._databaseType != const.COUNTRY_EDITION:
+                    message = 'Invalid database type; expected IPv6 address'
+                    raise ValueError(message)
+                if ipv == 6 and self._databaseType != const.COUNTRY_EDITION_V6:
+                    message = 'Invalid database type; expected IPv4 address'
+                    raise ValueError(message)
+
+                country_id = self.id_by_addr(addr)
+                return const.COUNTRY_CODES[country_id]
+            elif self._databaseType in const.REGION_CITY_EDITIONS:
+                return self.region_by_addr(addr).get('country_code')
+
+            message = 'Invalid database type, expected Country, City or Region'
+            raise GeoIPError(message)
         except ValueError:
-            raise GeoIPError('*_by_addr methods only accept IP addresses. Use *_by_name for hostnames. (Address: %s)' % addr)
+            raise GeoIPError('Failed to lookup address %s' % addr)

     def country_code_by_name(self, hostname):
         """
         Returns 2-letter country code (e.g. 'US') for specified hostname.
         Use this method if you have a Country, Region, or City database.

-        @param hostname: host name
+        @param hostname: Hostname
         @type hostname: str
         @return: 2-letter country code
         @rtype: str
         """
-        addr = socket.gethostbyname(hostname)
+        addr = self._gethostbyname(hostname)

         return self.country_code_by_addr(addr)

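The dispatch idea in the new `country_code_by_addr` can be shown in isolation: the address family is inferred from the presence of a `:` in the string, and the integer returned by `id_by_addr` subscripts parallel constant tables. A small sketch with a hypothetical table excerpt standing in for `pygeoip.const`:

```python
# Hypothetical excerpt; the real table lives in pygeoip.const.COUNTRY_CODES.
COUNTRY_CODES = ('', 'AP', 'EU', 'AD', 'AE')

def guess_ip_version(addr):
    """Mirror the diff's heuristic: any colon means IPv6, otherwise IPv4."""
    return 6 if addr.find(':') >= 0 else 4

def code_for_index(country_id):
    """Country index -> 2-letter code via the parallel table."""
    return COUNTRY_CODES[country_id]

print(guess_ip_version('8.8.8.8'))      # -> 4
print(guess_ip_version('2001:db8::1'))  # -> 6
print(code_for_index(3))                # -> 'AD'
```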
     def country_name_by_addr(self, addr):
@@ -465,34 +489,35 @@ class GeoIP(GeoIPBase):
         @rtype: str
         """
         try:
-            if self._databaseType == const.COUNTRY_EDITION:
-                country_id = self._lookup_country_id(addr)
+            VALID_EDITIONS = (const.COUNTRY_EDITION, const.COUNTRY_EDITION_V6)
+            if self._databaseType in VALID_EDITIONS:
+                country_id = self.id_by_addr(addr)
                 return const.COUNTRY_NAMES[country_id]
-            elif self._databaseType in (const.CITY_EDITION_REV0, const.CITY_EDITION_REV1):
-                return self.record_by_addr(addr)['country_name']
+            elif self._databaseType in const.CITY_EDITIONS:
+                return self.record_by_addr(addr).get('country_name')
             else:
-                raise GeoIPError('Invalid database type; country_* methods expect '\
-                                 'Country or City database')
+                message = 'Invalid database type, expected Country or City'
+                raise GeoIPError(message)
         except ValueError:
-            raise GeoIPError('*_by_addr methods only accept IP addresses. Use *_by_name for hostnames. (Address: %s)' % addr)
+            raise GeoIPError('Failed to lookup address %s' % addr)

     def country_name_by_name(self, hostname):
         """
         Returns full country name for specified hostname.
         Use this method if you have a Country database.

-        @param hostname: host name
+        @param hostname: Hostname
         @type hostname: str
         @return: country name
         @rtype: str
         """
-        addr = socket.gethostbyname(hostname)
+        addr = self._gethostbyname(hostname)
         return self.country_name_by_addr(addr)

     def org_by_addr(self, addr):
         """
-        Lookup the organization (or ISP) for given IP address.
-        Use this method if you have an Organization/ISP database.
+        Lookup Organization, ISP or ASNum for given IP address.
+        Use this method if you have an Organization, ISP or ASNum database.

         @param addr: IP address
         @type addr: str
@@ -500,31 +525,30 @@ class GeoIP(GeoIPBase):
         @rtype: str
         """
         try:
-            ipnum = ip2long(addr)
+            ipnum = util.ip2long(addr)

             if not ipnum:
-                raise ValueError("Invalid IP address: %s" % addr)
+                raise ValueError('Invalid IP address')

-            if self._databaseType not in (const.ORG_EDITION, const.ISP_EDITION, const.ASNUM_EDITION):
-                raise GeoIPError('Invalid database type; org_* methods expect '\
-                                 'Org/ISP database')
+            valid = (const.ORG_EDITION, const.ISP_EDITION, const.ASNUM_EDITION, const.ASNUM_EDITION_V6)
+            if self._databaseType not in valid:
+                message = 'Invalid database type, expected Org, ISP or ASNum'
+                raise GeoIPError(message)

             return self._get_org(ipnum)
         except ValueError:
-            raise GeoIPError('*_by_addr methods only accept IP addresses. Use *_by_name for hostnames. (Address: %s)' % addr)
+            raise GeoIPError('Failed to lookup address %s' % addr)

     def org_by_name(self, hostname):
         """
         Lookup the organization (or ISP) for hostname.
         Use this method if you have an Organization/ISP database.

-        @param hostname: host name
+        @param hostname: Hostname
         @type hostname: str
-        @return: organization or ISP name
+        @return: Organization or ISP name
         @rtype: str
         """
-        addr = socket.gethostbyname(hostname)
+        addr = self._gethostbyname(hostname)

         return self.org_by_addr(addr)

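The `*_by_addr` methods above share one error-handling pattern: input problems surface as `ValueError` inside the `try` block and are re-raised as the library's `GeoIPError`. A stripped-down sketch of that wrapping, with a stand-in exception class and lookup table rather than the real pygeoip internals:

```python
class GeoIPError(Exception):
    """Stand-in for pygeoip's GeoIPError."""
    pass

def lookup(addr, table):
    """Validate, look up, and translate ValueError into GeoIPError."""
    try:
        if not addr:
            raise ValueError('Invalid IP address')
        return table.get(addr)
    except ValueError:
        raise GeoIPError('Failed to lookup address %s' % addr)

print(lookup('1.2.3.4', {'1.2.3.4': 'US'}))  # -> US
try:
    lookup('', {})
except GeoIPError as err:
    print(err)  # the wrapped error reaches the caller as GeoIPError
```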
     def record_by_addr(self, addr):
@@ -534,38 +558,41 @@ class GeoIP(GeoIPBase):

         @param addr: IP address
         @type addr: str
-        @return: dict with country_code, country_code3, country_name,
-            region, city, postal_code, latitude, longitude,
-            dma_code, metro_code, area_code, region_name, time_zone
+        @return: Dictionary with country_code, country_code3, country_name,
+            region, city, postal_code, latitude, longitude, dma_code,
+            metro_code, area_code, region_name, time_zone
         @rtype: dict
         """
         try:
-            ipnum = ip2long(addr)
+            ipnum = util.ip2long(addr)

             if not ipnum:
-                raise ValueError("Invalid IP address: %s" % addr)
+                raise ValueError('Invalid IP address')

-            if not self._databaseType in (const.CITY_EDITION_REV0, const.CITY_EDITION_REV1):
-                raise GeoIPError('Invalid database type; record_* methods expect City database')
+            if self._databaseType not in const.CITY_EDITIONS:
+                message = 'Invalid database type, expected City'
+                raise GeoIPError(message)

-            return self._get_record(ipnum)
+            rec = self._get_record(ipnum)
+            if not rec:
+                return None
+
+            return rec
         except ValueError:
-            raise GeoIPError('*_by_addr methods only accept IP addresses. Use *_by_name for hostnames. (Address: %s)' % addr)
+            raise GeoIPError('Failed to lookup address %s' % addr)

     def record_by_name(self, hostname):
         """
         Look up the record for a given hostname.
         Use this method if you have a City database.

-        @param hostname: host name
+        @param hostname: Hostname
         @type hostname: str
-        @return: dict with country_code, country_code3, country_name,
-            region, city, postal_code, latitude, longitude,
-            dma_code, metro_code, area_code, region_name, time_zone
+        @return: Dictionary with country_code, country_code3, country_name,
+            region, city, postal_code, latitude, longitude, dma_code,
+            metro_code, area_code, region_name, time_zone
         @rtype: dict
         """
-        addr = socket.gethostbyname(hostname)
+        addr = self._gethostbyname(hostname)

         return self.record_by_addr(addr)

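`record_by_addr` first converts the dotted-quad address to an integer with `util.ip2long`. The stdlib can express the same conversion; a sketch of the idea (a plausible equivalent, not pygeoip's actual implementation):

```python
import socket
import struct

def ip2long(ip):
    """Dotted-quad string -> unsigned 32-bit integer; 0 on unparsable input."""
    try:
        # inet_aton packs the address; '!L' unpacks it in network byte order.
        return struct.unpack('!L', socket.inet_aton(ip))[0]
    except socket.error:
        return 0

print(ip2long('192.0.2.1'))   # -> 3221225985
print(ip2long('not an ip'))   # -> 0
```

Returning 0 for bad input matches how the diff's callers treat a falsy result as "invalid address".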
     def region_by_addr(self, addr):
@@ -575,37 +602,33 @@ class GeoIP(GeoIPBase):

         @param addr: IP address
         @type addr: str
-        @return: dict containing country_code, region,
-            and region_name
+        @return: Dictionary containing country_code, region and region_name
         @rtype: dict
         """
         try:
-            ipnum = ip2long(addr)
+            ipnum = util.ip2long(addr)

             if not ipnum:
-                raise ValueError("Invalid IP address: %s" % addr)
+                raise ValueError('Invalid IP address')

-            if not self._databaseType in (const.REGION_EDITION_REV0, const.REGION_EDITION_REV1,
-                                          const.CITY_EDITION_REV0, const.CITY_EDITION_REV1):
-                raise GeoIPError('Invalid database type; region_* methods expect '\
-                                 'Region or City database')
+            if self._databaseType not in const.REGION_CITY_EDITIONS:
+                message = 'Invalid database type, expected Region or City'
+                raise GeoIPError(message)

             return self._get_region(ipnum)
         except ValueError:
-            raise GeoIPError('*_by_addr methods only accept IP addresses. Use *_by_name for hostnames. (Address: %s)' % addr)
+            raise GeoIPError('Failed to lookup address %s' % addr)

     def region_by_name(self, hostname):
         """
         Lookup the region for given hostname.
         Use this method if you have a Region database.

-        @param hostname: host name
+        @param hostname: Hostname
         @type hostname: str
-        @return: dict containing country_code, region,
-            and region_name
+        @return: Dictionary containing country_code, region, and region_name
         @rtype: dict
         """
-        addr = socket.gethostbyname(hostname)
+        addr = self._gethostbyname(hostname)
         return self.region_by_addr(addr)

     def time_zone_by_addr(self, addr):
@@ -613,35 +636,33 @@ class GeoIP(GeoIPBase):
         Look up the time zone for a given IP address.
         Use this method if you have a Region or City database.

-        @param hostname: IP address
-        @type hostname: str
+        @param addr: IP address
+        @type addr: str
         @return: Time zone
         @rtype: str
         """
         try:
-            ipnum = ip2long(addr)
+            ipnum = util.ip2long(addr)

             if not ipnum:
-                raise ValueError("Invalid IP address: %s" % addr)
+                raise ValueError('Invalid IP address')

-            if not self._databaseType in (const.REGION_EDITION_REV0, const.REGION_EDITION_REV1,
-                                          const.CITY_EDITION_REV0, const.CITY_EDITION_REV1):
-                raise GeoIPError('Invalid database type; region_* methods expect '\
-                                 'Region or City database')
+            if self._databaseType not in const.CITY_EDITIONS:
+                message = 'Invalid database type, expected City'
+                raise GeoIPError(message)

-            return self._get_record(ipnum)['time_zone']
+            return self._get_record(ipnum).get('time_zone')
         except ValueError:
-            raise GeoIPError('*_by_addr methods only accept IP addresses. Use *_by_name for hostnames. (Address: %s)' % addr)
+            raise GeoIPError('Failed to lookup address %s' % addr)

     def time_zone_by_name(self, hostname):
         """
         Look up the time zone for a given hostname.
         Use this method if you have a Region or City database.

-        @param hostname: host name
+        @param hostname: Hostname
         @type hostname: str
         @return: Time zone
         @rtype: str
         """
-        addr = socket.gethostbyname(hostname)
+        addr = self._gethostbyname(hostname)
         return self.time_zone_by_addr(addr)
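Each `*_by_name` method above is a thin wrapper: resolve the hostname, then delegate to the matching `*_by_addr` method. The shape can be sketched with a fake resolver and lookup standing in for the real ones (all names below are illustrative, not pygeoip's):

```python
def make_by_name(resolve, by_addr):
    """Build a hostname-based lookup from a resolver and an address-based lookup."""
    def by_name(hostname):
        return by_addr(resolve(hostname))
    return by_name

# Stand-ins: a canned DNS table and a canned address -> time zone table.
resolver = {'example.org': '192.0.2.7'}.get
tz_by_addr = {'192.0.2.7': 'America/Chicago'}.get

time_zone_by_name = make_by_name(resolver, tz_by_addr)
print(time_zone_by_name('example.org'))  # -> America/Chicago
```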
@@ -1,382 +1,431 @@
+# -*- coding: utf-8 -*-
 """
-Constants needed for parsing binary GeoIP databases. It is part of the pygeoip
-package.
+Constants needed for the binary parser. Part of the pygeoip package.

-@author: Jennifer Ennis <zaylea at gmail dot com>
+@author: Jennifer Ennis <zaylea@gmail.com>

-@license:
-Copyright(C) 2004 MaxMind LLC
+@license: Copyright(C) 2004 MaxMind LLC

 This program is free software: you can redistribute it and/or modify
 it under the terms of the GNU Lesser General Public License as published by
 the Free Software Foundation, either version 3 of the License, or
 (at your option) any later version.

 This program is distributed in the hope that it will be useful,
 but WITHOUT ANY WARRANTY; without even the implied warranty of
 MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 GNU General Public License for more details.

 You should have received a copy of the GNU Lesser General Public License
 along with this program. If not, see <http://www.gnu.org/licenses/lgpl.txt>.
 """

-GEOIP_STANDARD = 0
-GEOIP_MEMORY_CACHE = 1
-
-DMA_MAP = {
-    500 : 'Portland-Auburn, ME',
-    501 : 'New York, NY',
-    502 : 'Binghamton, NY',
-    503 : 'Macon, GA',
-    504 : 'Philadelphia, PA',
-    505 : 'Detroit, MI',
-    506 : 'Boston, MA',
-    507 : 'Savannah, GA',
-    508 : 'Pittsburgh, PA',
-    509 : 'Ft Wayne, IN',
-    510 : 'Cleveland, OH',
-    511 : 'Washington, DC',
-    512 : 'Baltimore, MD',
-    513 : 'Flint, MI',
-    514 : 'Buffalo, NY',
-    515 : 'Cincinnati, OH',
-    516 : 'Erie, PA',
-    517 : 'Charlotte, NC',
-    518 : 'Greensboro, NC',
-    519 : 'Charleston, SC',
-    520 : 'Augusta, GA',
-    521 : 'Providence, RI',
-    522 : 'Columbus, GA',
-    523 : 'Burlington, VT',
-    524 : 'Atlanta, GA',
-    525 : 'Albany, GA',
-    526 : 'Utica-Rome, NY',
-    527 : 'Indianapolis, IN',
-    528 : 'Miami, FL',
-    529 : 'Louisville, KY',
-    530 : 'Tallahassee, FL',
-    531 : 'Tri-Cities, TN',
-    532 : 'Albany-Schenectady-Troy, NY',
-    533 : 'Hartford, CT',
-    534 : 'Orlando, FL',
-    535 : 'Columbus, OH',
-    536 : 'Youngstown-Warren, OH',
-    537 : 'Bangor, ME',
-    538 : 'Rochester, NY',
-    539 : 'Tampa, FL',
-    540 : 'Traverse City-Cadillac, MI',
-    541 : 'Lexington, KY',
-    542 : 'Dayton, OH',
-    543 : 'Springfield-Holyoke, MA',
-    544 : 'Norfolk-Portsmouth, VA',
-    545 : 'Greenville-New Bern-Washington, NC',
-    546 : 'Columbia, SC',
-    547 : 'Toledo, OH',
-    548 : 'West Palm Beach, FL',
-    549 : 'Watertown, NY',
-    550 : 'Wilmington, NC',
-    551 : 'Lansing, MI',
-    552 : 'Presque Isle, ME',
-    553 : 'Marquette, MI',
-    554 : 'Wheeling, WV',
-    555 : 'Syracuse, NY',
-    556 : 'Richmond-Petersburg, VA',
-    557 : 'Knoxville, TN',
-    558 : 'Lima, OH',
-    559 : 'Bluefield-Beckley-Oak Hill, WV',
-    560 : 'Raleigh-Durham, NC',
-    561 : 'Jacksonville, FL',
-    563 : 'Grand Rapids, MI',
-    564 : 'Charleston-Huntington, WV',
-    565 : 'Elmira, NY',
-    566 : 'Harrisburg-Lancaster-Lebanon-York, PA',
-    567 : 'Greenville-Spartenburg, SC',
-    569 : 'Harrisonburg, VA',
-    570 : 'Florence-Myrtle Beach, SC',
-    571 : 'Ft Myers, FL',
-    573 : 'Roanoke-Lynchburg, VA',
-    574 : 'Johnstown-Altoona, PA',
-    575 : 'Chattanooga, TN',
-    576 : 'Salisbury, MD',
-    577 : 'Wilkes Barre-Scranton, PA',
-    581 : 'Terre Haute, IN',
-    582 : 'Lafayette, IN',
-    583 : 'Alpena, MI',
-    584 : 'Charlottesville, VA',
-    588 : 'South Bend, IN',
-    592 : 'Gainesville, FL',
-    596 : 'Zanesville, OH',
-    597 : 'Parkersburg, WV',
-    598 : 'Clarksburg-Weston, WV',
-    600 : 'Corpus Christi, TX',
-    602 : 'Chicago, IL',
-    603 : 'Joplin-Pittsburg, MO',
-    604 : 'Columbia-Jefferson City, MO',
-    605 : 'Topeka, KS',
-    606 : 'Dothan, AL',
-    609 : 'St Louis, MO',
-    610 : 'Rockford, IL',
-    611 : 'Rochester-Mason City-Austin, MN',
-    612 : 'Shreveport, LA',
-    613 : 'Minneapolis-St Paul, MN',
-    616 : 'Kansas City, MO',
-    617 : 'Milwaukee, WI',
-    618 : 'Houston, TX',
-    619 : 'Springfield, MO',
-    620 : 'Tuscaloosa, AL',
-    622 : 'New Orleans, LA',
-    623 : 'Dallas-Fort Worth, TX',
-    624 : 'Sioux City, IA',
-    625 : 'Waco-Temple-Bryan, TX',
-    626 : 'Victoria, TX',
-    627 : 'Wichita Falls, TX',
-    628 : 'Monroe, LA',
-    630 : 'Birmingham, AL',
-    631 : 'Ottumwa-Kirksville, IA',
-    632 : 'Paducah, KY',
-    633 : 'Odessa-Midland, TX',
-    634 : 'Amarillo, TX',
-    635 : 'Austin, TX',
-    636 : 'Harlingen, TX',
-    637 : 'Cedar Rapids-Waterloo, IA',
-    638 : 'St Joseph, MO',
-    639 : 'Jackson, TN',
-    640 : 'Memphis, TN',
-    641 : 'San Antonio, TX',
-    642 : 'Lafayette, LA',
-    643 : 'Lake Charles, LA',
-    644 : 'Alexandria, LA',
-    646 : 'Anniston, AL',
-    647 : 'Greenwood-Greenville, MS',
-    648 : 'Champaign-Springfield-Decatur, IL',
-    649 : 'Evansville, IN',
-    650 : 'Oklahoma City, OK',
-    651 : 'Lubbock, TX',
-    652 : 'Omaha, NE',
-    656 : 'Panama City, FL',
-    657 : 'Sherman, TX',
-    658 : 'Green Bay-Appleton, WI',
-    659 : 'Nashville, TN',
-    661 : 'San Angelo, TX',
-    662 : 'Abilene-Sweetwater, TX',
-    669 : 'Madison, WI',
-    670 : 'Ft Smith-Fay-Springfield, AR',
-    671 : 'Tulsa, OK',
-    673 : 'Columbus-Tupelo-West Point, MS',
-    675 : 'Peoria-Bloomington, IL',
-    676 : 'Duluth, MN',
-    678 : 'Wichita, KS',
-    679 : 'Des Moines, IA',
-    682 : 'Davenport-Rock Island-Moline, IL',
-    686 : 'Mobile, AL',
-    687 : 'Minot-Bismarck-Dickinson, ND',
-    691 : 'Huntsville, AL',
-    692 : 'Beaumont-Port Author, TX',
-    693 : 'Little Rock-Pine Bluff, AR',
-    698 : 'Montgomery, AL',
-    702 : 'La Crosse-Eau Claire, WI',
-    705 : 'Wausau-Rhinelander, WI',
-    709 : 'Tyler-Longview, TX',
-    710 : 'Hattiesburg-Laurel, MS',
-    711 : 'Meridian, MS',
-    716 : 'Baton Rouge, LA',
-    717 : 'Quincy, IL',
-    718 : 'Jackson, MS',
-    722 : 'Lincoln-Hastings, NE',
-    724 : 'Fargo-Valley City, ND',
-    725 : 'Sioux Falls, SD',
-    734 : 'Jonesboro, AR',
-    736 : 'Bowling Green, KY',
-    737 : 'Mankato, MN',
-    740 : 'North Platte, NE',
-    743 : 'Anchorage, AK',
-    744 : 'Honolulu, HI',
-    745 : 'Fairbanks, AK',
-    746 : 'Biloxi-Gulfport, MS',
-    747 : 'Juneau, AK',
-    749 : 'Laredo, TX',
-    751 : 'Denver, CO',
-    752 : 'Colorado Springs, CO',
-    753 : 'Phoenix, AZ',
-    754 : 'Butte-Bozeman, MT',
-    755 : 'Great Falls, MT',
-    756 : 'Billings, MT',
-    757 : 'Boise, ID',
-    758 : 'Idaho Falls-Pocatello, ID',
-    759 : 'Cheyenne, WY',
-    760 : 'Twin Falls, ID',
-    762 : 'Missoula, MT',
-    764 : 'Rapid City, SD',
-    765 : 'El Paso, TX',
-    766 : 'Helena, MT',
-    767 : 'Casper-Riverton, WY',
-    770 : 'Salt Lake City, UT',
-    771 : 'Yuma, AZ',
-    773 : 'Grand Junction, CO',
-    789 : 'Tucson, AZ',
-    790 : 'Albuquerque, NM',
-    798 : 'Glendive, MT',
-    800 : 'Bakersfield, CA',
-    801 : 'Eugene, OR',
+from platform import python_version_tuple
+
+PY2 = python_version_tuple()[0] == '2'
+PY3 = python_version_tuple()[0] == '3'
+
+GEOIP_STANDARD = 0
+GEOIP_MEMORY_CACHE = 1
+
+DMA_MAP = {
+    500: 'Portland-Auburn, ME',
+    501: 'New York, NY',
+    502: 'Binghamton, NY',
+    503: 'Macon, GA',
+    504: 'Philadelphia, PA',
+    505: 'Detroit, MI',
+    506: 'Boston, MA',
+    507: 'Savannah, GA',
+    508: 'Pittsburgh, PA',
+    509: 'Ft Wayne, IN',
+    510: 'Cleveland, OH',
+    511: 'Washington, DC',
+    512: 'Baltimore, MD',
+    513: 'Flint, MI',
+    514: 'Buffalo, NY',
+    515: 'Cincinnati, OH',
+    516: 'Erie, PA',
+    517: 'Charlotte, NC',
+    518: 'Greensboro, NC',
+    519: 'Charleston, SC',
+    520: 'Augusta, GA',
+    521: 'Providence, RI',
+    522: 'Columbus, GA',
+    523: 'Burlington, VT',
+    524: 'Atlanta, GA',
+    525: 'Albany, GA',
+    526: 'Utica-Rome, NY',
+    527: 'Indianapolis, IN',
+    528: 'Miami, FL',
+    529: 'Louisville, KY',
+    530: 'Tallahassee, FL',
+    531: 'Tri-Cities, TN',
+    532: 'Albany-Schenectady-Troy, NY',
+    533: 'Hartford, CT',
+    534: 'Orlando, FL',
+    535: 'Columbus, OH',
+    536: 'Youngstown-Warren, OH',
+    537: 'Bangor, ME',
+    538: 'Rochester, NY',
+    539: 'Tampa, FL',
+    540: 'Traverse City-Cadillac, MI',
+    541: 'Lexington, KY',
+    542: 'Dayton, OH',
+    543: 'Springfield-Holyoke, MA',
+    544: 'Norfolk-Portsmouth, VA',
+    545: 'Greenville-New Bern-Washington, NC',
+    546: 'Columbia, SC',
+    547: 'Toledo, OH',
+    548: 'West Palm Beach, FL',
+    549: 'Watertown, NY',
+    550: 'Wilmington, NC',
+    551: 'Lansing, MI',
+    552: 'Presque Isle, ME',
+    553: 'Marquette, MI',
+    554: 'Wheeling, WV',
+    555: 'Syracuse, NY',
+    556: 'Richmond-Petersburg, VA',
+    557: 'Knoxville, TN',
+    558: 'Lima, OH',
+    559: 'Bluefield-Beckley-Oak Hill, WV',
+    560: 'Raleigh-Durham, NC',
+    561: 'Jacksonville, FL',
+    563: 'Grand Rapids, MI',
+    564: 'Charleston-Huntington, WV',
+    565: 'Elmira, NY',
+    566: 'Harrisburg-Lancaster-Lebanon-York, PA',
+    567: 'Greenville-Spartenburg, SC',
+    569: 'Harrisonburg, VA',
+    570: 'Florence-Myrtle Beach, SC',
+    571: 'Ft Myers, FL',
+    573: 'Roanoke-Lynchburg, VA',
+    574: 'Johnstown-Altoona, PA',
+    575: 'Chattanooga, TN',
+    576: 'Salisbury, MD',
+    577: 'Wilkes Barre-Scranton, PA',
+    581: 'Terre Haute, IN',
+    582: 'Lafayette, IN',
+    583: 'Alpena, MI',
+    584: 'Charlottesville, VA',
+    588: 'South Bend, IN',
+    592: 'Gainesville, FL',
+    596: 'Zanesville, OH',
+    597: 'Parkersburg, WV',
+    598: 'Clarksburg-Weston, WV',
+    600: 'Corpus Christi, TX',
+    602: 'Chicago, IL',
+    603: 'Joplin-Pittsburg, MO',
+    604: 'Columbia-Jefferson City, MO',
+    605: 'Topeka, KS',
+    606: 'Dothan, AL',
+    609: 'St Louis, MO',
+    610: 'Rockford, IL',
+    611: 'Rochester-Mason City-Austin, MN',
+    612: 'Shreveport, LA',
+    613: 'Minneapolis-St Paul, MN',
+    616: 'Kansas City, MO',
+    617: 'Milwaukee, WI',
+    618: 'Houston, TX',
+    619: 'Springfield, MO',
+    620: 'Tuscaloosa, AL',
+    622: 'New Orleans, LA',
+    623: 'Dallas-Fort Worth, TX',
+    624: 'Sioux City, IA',
+    625: 'Waco-Temple-Bryan, TX',
+    626: 'Victoria, TX',
+    627: 'Wichita Falls, TX',
+    628: 'Monroe, LA',
+    630: 'Birmingham, AL',
+    631: 'Ottumwa-Kirksville, IA',
+    632: 'Paducah, KY',
+    633: 'Odessa-Midland, TX',
+    634: 'Amarillo, TX',
+    635: 'Austin, TX',
+    636: 'Harlingen, TX',
+    637: 'Cedar Rapids-Waterloo, IA',
+    638: 'St Joseph, MO',
+    639: 'Jackson, TN',
+    640: 'Memphis, TN',
+    641: 'San Antonio, TX',
+    642: 'Lafayette, LA',
+    643: 'Lake Charles, LA',
+    644: 'Alexandria, LA',
+    646: 'Anniston, AL',
+    647: 'Greenwood-Greenville, MS',
+    648: 'Champaign-Springfield-Decatur, IL',
+    649: 'Evansville, IN',
+    650: 'Oklahoma City, OK',
+    651: 'Lubbock, TX',
+    652: 'Omaha, NE',
+    656: 'Panama City, FL',
+    657: 'Sherman, TX',
+    658: 'Green Bay-Appleton, WI',
+    659: 'Nashville, TN',
+    661: 'San Angelo, TX',
+    662: 'Abilene-Sweetwater, TX',
+    669: 'Madison, WI',
+    670: 'Ft Smith-Fay-Springfield, AR',
+    671: 'Tulsa, OK',
+    673: 'Columbus-Tupelo-West Point, MS',
+    675: 'Peoria-Bloomington, IL',
+    676: 'Duluth, MN',
+    678: 'Wichita, KS',
+    679: 'Des Moines, IA',
+    682: 'Davenport-Rock Island-Moline, IL',
+    686: 'Mobile, AL',
+    687: 'Minot-Bismarck-Dickinson, ND',
+    691: 'Huntsville, AL',
+    692: 'Beaumont-Port Author, TX',
+    693: 'Little Rock-Pine Bluff, AR',
+    698: 'Montgomery, AL',
+    702: 'La Crosse-Eau Claire, WI',
+    705: 'Wausau-Rhinelander, WI',
+    709: 'Tyler-Longview, TX',
+    710: 'Hattiesburg-Laurel, MS',
+    711: 'Meridian, MS',
+    716: 'Baton Rouge, LA',
+    717: 'Quincy, IL',
+    718: 'Jackson, MS',
+    722: 'Lincoln-Hastings, NE',
+    724: 'Fargo-Valley City, ND',
+    725: 'Sioux Falls, SD',
+    734: 'Jonesboro, AR',
+    736: 'Bowling Green, KY',
+    737: 'Mankato, MN',
+    740: 'North Platte, NE',
+    743: 'Anchorage, AK',
+    744: 'Honolulu, HI',
+    745: 'Fairbanks, AK',
+    746: 'Biloxi-Gulfport, MS',
+    747: 'Juneau, AK',
+    749: 'Laredo, TX',
+    751: 'Denver, CO',
+    752: 'Colorado Springs, CO',
+    753: 'Phoenix, AZ',
+    754: 'Butte-Bozeman, MT',
+    755: 'Great Falls, MT',
+    756: 'Billings, MT',
+    757: 'Boise, ID',
+    758: 'Idaho Falls-Pocatello, ID',
+    759: 'Cheyenne, WY',
+    760: 'Twin Falls, ID',
+    762: 'Missoula, MT',
+    764: 'Rapid City, SD',
+    765: 'El Paso, TX',
+    766: 'Helena, MT',
+    767: 'Casper-Riverton, WY',
+    770: 'Salt Lake City, UT',
+    771: 'Yuma, AZ',
+    773: 'Grand Junction, CO',
|
789: 'Tucson, AZ',
|
||||||
802 : 'Eureka, CA',
|
790: 'Albuquerque, NM',
|
||||||
803 : 'Los Angeles, CA',
|
798: 'Glendive, MT',
|
||||||
804 : 'Palm Springs, CA',
|
800: 'Bakersfield, CA',
|
||||||
807 : 'San Francisco, CA',
|
801: 'Eugene, OR',
|
||||||
810 : 'Yakima-Pasco, WA',
|
802: 'Eureka, CA',
|
||||||
811 : 'Reno, NV',
|
803: 'Los Angeles, CA',
|
||||||
813 : 'Medford-Klamath Falls, OR',
|
804: 'Palm Springs, CA',
|
||||||
819 : 'Seattle-Tacoma, WA',
|
807: 'San Francisco, CA',
|
||||||
820 : 'Portland, OR',
|
810: 'Yakima-Pasco, WA',
|
||||||
821 : 'Bend, OR',
|
811: 'Reno, NV',
|
||||||
825 : 'San Diego, CA',
|
813: 'Medford-Klamath Falls, OR',
|
||||||
828 : 'Monterey-Salinas, CA',
|
819: 'Seattle-Tacoma, WA',
|
||||||
839 : 'Las Vegas, NV',
|
820: 'Portland, OR',
|
||||||
855 : 'Santa Barbara, CA',
|
821: 'Bend, OR',
|
||||||
862 : 'Sacramento, CA',
|
825: 'San Diego, CA',
|
||||||
866 : 'Fresno, CA',
|
828: 'Monterey-Salinas, CA',
|
||||||
868 : 'Chico-Redding, CA',
|
839: 'Las Vegas, NV',
|
||||||
881 : 'Spokane, WA'
|
855: 'Santa Barbara, CA',
|
||||||
}
|
862: 'Sacramento, CA',
|
||||||
|
866: 'Fresno, CA',
|
||||||
COUNTRY_CODES = (
|
868: 'Chico-Redding, CA',
|
||||||
'', 'AP', 'EU', 'AD', 'AE', 'AF', 'AG', 'AI', 'AL', 'AM', 'AN', 'AO', 'AQ',
|
881: 'Spokane, WA'
|
||||||
'AR', 'AS', 'AT', 'AU', 'AW', 'AZ', 'BA', 'BB', 'BD', 'BE', 'BF', 'BG', 'BH',
|
}
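The dictionary above maps Nielsen DMA (Designated Market Area) codes to market names, so a region lookup is a plain dict access. A minimal sketch of that usage follows; the dictionary's name is not visible in this hunk, so `DMA_MAP` is assumed here for illustration, with only a two-entry excerpt of the real data.

```python
# Assumed name for the DMA-code -> market-name mapping shown above;
# only a tiny excerpt of the real table is reproduced here.
DMA_MAP = {
    819: 'Seattle-Tacoma, WA',
    881: 'Spokane, WA',
}


def dma_market(code):
    """Return the market name for a Nielsen DMA code, or None if unknown."""
    return DMA_MAP.get(code)


print(dma_market(881))  # Spokane, WA
```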

COUNTRY_CODES = (
    '',
    'AP', 'EU', 'AD', 'AE', 'AF', 'AG', 'AI', 'AL', 'AM', 'AN', 'AO', 'AQ',
    'AR', 'AS', 'AT', 'AU', 'AW', 'AZ', 'BA', 'BB', 'BD', 'BE', 'BF', 'BG',
    'BH', 'BI', 'BJ', 'BM', 'BN', 'BO', 'BR', 'BS', 'BT', 'BV', 'BW', 'BY',
    'BZ', 'CA', 'CC', 'CD', 'CF', 'CG', 'CH', 'CI', 'CK', 'CL', 'CM', 'CN',
    'CO', 'CR', 'CU', 'CV', 'CX', 'CY', 'CZ', 'DE', 'DJ', 'DK', 'DM', 'DO',
    'DZ', 'EC', 'EE', 'EG', 'EH', 'ER', 'ES', 'ET', 'FI', 'FJ', 'FK', 'FM',
    'FO', 'FR', 'FX', 'GA', 'GB', 'GD', 'GE', 'GF', 'GH', 'GI', 'GL', 'GM',
    'GN', 'GP', 'GQ', 'GR', 'GS', 'GT', 'GU', 'GW', 'GY', 'HK', 'HM', 'HN',
    'HR', 'HT', 'HU', 'ID', 'IE', 'IL', 'IN', 'IO', 'IQ', 'IR', 'IS', 'IT',
    'JM', 'JO', 'JP', 'KE', 'KG', 'KH', 'KI', 'KM', 'KN', 'KP', 'KR', 'KW',
    'KY', 'KZ', 'LA', 'LB', 'LC', 'LI', 'LK', 'LR', 'LS', 'LT', 'LU', 'LV',
    'LY', 'MA', 'MC', 'MD', 'MG', 'MH', 'MK', 'ML', 'MM', 'MN', 'MO', 'MP',
    'MQ', 'MR', 'MS', 'MT', 'MU', 'MV', 'MW', 'MX', 'MY', 'MZ', 'NA', 'NC',
    'NE', 'NF', 'NG', 'NI', 'NL', 'NO', 'NP', 'NR', 'NU', 'NZ', 'OM', 'PA',
    'PE', 'PF', 'PG', 'PH', 'PK', 'PL', 'PM', 'PN', 'PR', 'PS', 'PT', 'PW',
    'PY', 'QA', 'RE', 'RO', 'RU', 'RW', 'SA', 'SB', 'SC', 'SD', 'SE', 'SG',
    'SH', 'SI', 'SJ', 'SK', 'SL', 'SM', 'SN', 'SO', 'SR', 'ST', 'SV', 'SY',
    'SZ', 'TC', 'TD', 'TF', 'TG', 'TH', 'TJ', 'TK', 'TM', 'TN', 'TO', 'TL',
    'TR', 'TT', 'TV', 'TW', 'TZ', 'UA', 'UG', 'UM', 'US', 'UY', 'UZ', 'VA',
    'VC', 'VE', 'VG', 'VI', 'VN', 'VU', 'WF', 'WS', 'YE', 'YT', 'RS', 'ZA',
    'ZM', 'ME', 'ZW', 'A1', 'A2', 'O1', 'AX', 'GG', 'IM', 'JE', 'BL', 'MF',
    'BQ', 'SS'
)

COUNTRY_CODES3 = (
    '', 'AP', 'EU', 'AND', 'ARE', 'AFG', 'ATG', 'AIA', 'ALB', 'ARM', 'ANT',
    'AGO', 'AQ', 'ARG', 'ASM', 'AUT', 'AUS', 'ABW', 'AZE', 'BIH', 'BRB', 'BGD',
    'BEL', 'BFA', 'BGR', 'BHR', 'BDI', 'BEN', 'BMU', 'BRN', 'BOL', 'BRA',
    'BHS', 'BTN', 'BV', 'BWA', 'BLR', 'BLZ', 'CAN', 'CC', 'COD', 'CAF', 'COG',
    'CHE', 'CIV', 'COK', 'CHL', 'CMR', 'CHN', 'COL', 'CRI', 'CUB', 'CPV', 'CX',
    'CYP', 'CZE', 'DEU', 'DJI', 'DNK', 'DMA', 'DOM', 'DZA', 'ECU', 'EST',
    'EGY', 'ESH', 'ERI', 'ESP', 'ETH', 'FIN', 'FJI', 'FLK', 'FSM', 'FRO',
    'FRA', 'FX', 'GAB', 'GBR', 'GRD', 'GEO', 'GUF', 'GHA', 'GIB', 'GRL', 'GMB',
    'GIN', 'GLP', 'GNQ', 'GRC', 'GS', 'GTM', 'GUM', 'GNB', 'GUY', 'HKG', 'HM',
    'HND', 'HRV', 'HTI', 'HUN', 'IDN', 'IRL', 'ISR', 'IND', 'IO', 'IRQ', 'IRN',
    'ISL', 'ITA', 'JAM', 'JOR', 'JPN', 'KEN', 'KGZ', 'KHM', 'KIR', 'COM',
    'KNA', 'PRK', 'KOR', 'KWT', 'CYM', 'KAZ', 'LAO', 'LBN', 'LCA', 'LIE',
    'LKA', 'LBR', 'LSO', 'LTU', 'LUX', 'LVA', 'LBY', 'MAR', 'MCO', 'MDA',
    'MDG', 'MHL', 'MKD', 'MLI', 'MMR', 'MNG', 'MAC', 'MNP', 'MTQ', 'MRT',
    'MSR', 'MLT', 'MUS', 'MDV', 'MWI', 'MEX', 'MYS', 'MOZ', 'NAM', 'NCL',
    'NER', 'NFK', 'NGA', 'NIC', 'NLD', 'NOR', 'NPL', 'NRU', 'NIU', 'NZL',
    'OMN', 'PAN', 'PER', 'PYF', 'PNG', 'PHL', 'PAK', 'POL', 'SPM', 'PCN',
    'PRI', 'PSE', 'PRT', 'PLW', 'PRY', 'QAT', 'REU', 'ROU', 'RUS', 'RWA',
    'SAU', 'SLB', 'SYC', 'SDN', 'SWE', 'SGP', 'SHN', 'SVN', 'SJM', 'SVK',
    'SLE', 'SMR', 'SEN', 'SOM', 'SUR', 'STP', 'SLV', 'SYR', 'SWZ', 'TCA',
    'TCD', 'TF', 'TGO', 'THA', 'TJK', 'TKL', 'TLS', 'TKM', 'TUN', 'TON', 'TUR',
    'TTO', 'TUV', 'TWN', 'TZA', 'UKR', 'UGA', 'UM', 'USA', 'URY', 'UZB', 'VAT',
    'VCT', 'VEN', 'VGB', 'VIR', 'VNM', 'VUT', 'WLF', 'WSM', 'YEM', 'YT', 'SRB',
    'ZAF', 'ZMB', 'MNE', 'ZWE', 'A1', 'A2', 'O1', 'ALA', 'GGY', 'IMN', 'JEY',
    'BLM', 'MAF', 'BES', 'SSD'
)

COUNTRY_NAMES = (
    '', 'Asia/Pacific Region', 'Europe', 'Andorra', 'United Arab Emirates',
    'Afghanistan', 'Antigua and Barbuda', 'Anguilla', 'Albania', 'Armenia',
    'Netherlands Antilles', 'Angola', 'Antarctica', 'Argentina',
    'American Samoa', 'Austria', 'Australia', 'Aruba', 'Azerbaijan',
    'Bosnia and Herzegovina', 'Barbados', 'Bangladesh', 'Belgium',
    'Burkina Faso', 'Bulgaria', 'Bahrain', 'Burundi', 'Benin', 'Bermuda',
    'Brunei Darussalam', 'Bolivia', 'Brazil', 'Bahamas', 'Bhutan',
    'Bouvet Island', 'Botswana', 'Belarus', 'Belize', 'Canada',
    'Cocos (Keeling) Islands', 'Congo, The Democratic Republic of the',
    'Central African Republic', 'Congo', 'Switzerland', 'Cote D\'Ivoire',
    'Cook Islands', 'Chile', 'Cameroon', 'China', 'Colombia', 'Costa Rica',
    'Cuba', 'Cape Verde', 'Christmas Island', 'Cyprus', 'Czech Republic',
    'Germany', 'Djibouti', 'Denmark', 'Dominica', 'Dominican Republic',
    'Algeria', 'Ecuador', 'Estonia', 'Egypt', 'Western Sahara', 'Eritrea',
    'Spain', 'Ethiopia', 'Finland', 'Fiji', 'Falkland Islands (Malvinas)',
    'Micronesia, Federated States of', 'Faroe Islands', 'France',
    'France, Metropolitan', 'Gabon', 'United Kingdom', 'Grenada', 'Georgia',
    'French Guiana', 'Ghana', 'Gibraltar', 'Greenland', 'Gambia', 'Guinea',
    'Guadeloupe', 'Equatorial Guinea', 'Greece',
    'South Georgia and the South Sandwich Islands', 'Guatemala', 'Guam',
    'Guinea-Bissau', 'Guyana', 'Hong Kong',
    'Heard Island and McDonald Islands', 'Honduras', 'Croatia', 'Haiti',
    'Hungary', 'Indonesia', 'Ireland', 'Israel', 'India',
    'British Indian Ocean Territory', 'Iraq', 'Iran, Islamic Republic of',
    'Iceland', 'Italy', 'Jamaica', 'Jordan', 'Japan', 'Kenya', 'Kyrgyzstan',
    'Cambodia', 'Kiribati', 'Comoros', 'Saint Kitts and Nevis',
    'Korea, Democratic People\'s Republic of', 'Korea, Republic of', 'Kuwait',
    'Cayman Islands', 'Kazakhstan', 'Lao People\'s Democratic Republic',
    'Lebanon', 'Saint Lucia', 'Liechtenstein', 'Sri Lanka', 'Liberia',
    'Lesotho', 'Lithuania', 'Luxembourg', 'Latvia', 'Libya', 'Morocco',
    'Monaco', 'Moldova, Republic of', 'Madagascar', 'Marshall Islands',
    'Macedonia', 'Mali', 'Myanmar', 'Mongolia', 'Macau',
    'Northern Mariana Islands', 'Martinique', 'Mauritania', 'Montserrat',
    'Malta', 'Mauritius', 'Maldives', 'Malawi', 'Mexico', 'Malaysia',
    'Mozambique', 'Namibia', 'New Caledonia', 'Niger', 'Norfolk Island',
    'Nigeria', 'Nicaragua', 'Netherlands', 'Norway', 'Nepal', 'Nauru', 'Niue',
    'New Zealand', 'Oman', 'Panama', 'Peru', 'French Polynesia',
    'Papua New Guinea', 'Philippines', 'Pakistan', 'Poland',
    'Saint Pierre and Miquelon', 'Pitcairn Islands', 'Puerto Rico',
    'Palestinian Territory', 'Portugal', 'Palau', 'Paraguay', 'Qatar',
    'Reunion', 'Romania', 'Russian Federation', 'Rwanda', 'Saudi Arabia',
    'Solomon Islands', 'Seychelles', 'Sudan', 'Sweden', 'Singapore',
    'Saint Helena', 'Slovenia', 'Svalbard and Jan Mayen', 'Slovakia',
    'Sierra Leone', 'San Marino', 'Senegal', 'Somalia', 'Suriname',
    'Sao Tome and Principe', 'El Salvador', 'Syrian Arab Republic',
    'Swaziland', 'Turks and Caicos Islands', 'Chad',
    'French Southern Territories', 'Togo', 'Thailand', 'Tajikistan', 'Tokelau',
    'Turkmenistan', 'Tunisia', 'Tonga', 'Timor-Leste', 'Turkey',
    'Trinidad and Tobago', 'Tuvalu', 'Taiwan', 'Tanzania, United Republic of',
    'Ukraine', 'Uganda', 'United States Minor Outlying Islands',
    'United States', 'Uruguay', 'Uzbekistan', 'Holy See (Vatican City State)',
    'Saint Vincent and the Grenadines', 'Venezuela', 'Virgin Islands, British',
    'Virgin Islands, U.S.', 'Vietnam', 'Vanuatu', 'Wallis and Futuna', 'Samoa',
    'Yemen', 'Mayotte', 'Serbia', 'South Africa', 'Zambia', 'Montenegro',
    'Zimbabwe', 'Anonymous Proxy', 'Satellite Provider', 'Other',
    'Aland Islands', 'Guernsey', 'Isle of Man', 'Jersey', 'Saint Barthelemy',
    'Saint Martin', 'Bonaire, Sint Eustatius and Saba', 'South Sudan'
)
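`COUNTRY_CODES`, `COUNTRY_CODES3`, and `COUNTRY_NAMES` are parallel tuples: the same integer index selects a country's ISO-3166 two-letter code, three-letter code, and display name. A short self-contained sketch, using only a five-entry excerpt of the tuples above, shows how an index decoded from a database record resolves to all three:

```python
# Abbreviated excerpts of the parallel tuples above; the real tuples
# are far longer, but indexing works identically.
COUNTRY_CODES = ('', 'AP', 'EU', 'AD', 'AE')
COUNTRY_CODES3 = ('', 'AP', 'EU', 'AND', 'ARE')
COUNTRY_NAMES = ('', 'Asia/Pacific Region', 'Europe', 'Andorra',
                 'United Arab Emirates')

country_id = 3  # an index as decoded from a GeoIP record
print(COUNTRY_CODES[country_id],
      COUNTRY_CODES3[country_id],
      COUNTRY_NAMES[country_id])  # AD AND Andorra
```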

CONTINENT_NAMES = (
    '--', 'AS', 'EU', 'EU', 'AS', 'AS', 'NA', 'NA', 'EU', 'AS', 'NA', 'AF',
    'AN', 'SA', 'OC', 'EU', 'OC', 'NA', 'AS', 'EU', 'NA', 'AS', 'EU', 'AF',
    'EU', 'AS', 'AF', 'AF', 'NA', 'AS', 'SA', 'SA', 'NA', 'AS', 'AN', 'AF',
    'EU', 'NA', 'NA', 'AS', 'AF', 'AF', 'AF', 'EU', 'AF', 'OC', 'SA', 'AF',
    'AS', 'SA', 'NA', 'NA', 'AF', 'AS', 'AS', 'EU', 'EU', 'AF', 'EU', 'NA',
    'NA', 'AF', 'SA', 'EU', 'AF', 'AF', 'AF', 'EU', 'AF', 'EU', 'OC', 'SA',
    'OC', 'EU', 'EU', 'NA', 'AF', 'EU', 'NA', 'AS', 'SA', 'AF', 'EU', 'NA',
    'AF', 'AF', 'NA', 'AF', 'EU', 'AN', 'NA', 'OC', 'AF', 'SA', 'AS', 'AN',
    'NA', 'EU', 'NA', 'EU', 'AS', 'EU', 'AS', 'AS', 'AS', 'AS', 'AS', 'EU',
    'EU', 'NA', 'AS', 'AS', 'AF', 'AS', 'AS', 'OC', 'AF', 'NA', 'AS', 'AS',
    'AS', 'NA', 'AS', 'AS', 'AS', 'NA', 'EU', 'AS', 'AF', 'AF', 'EU', 'EU',
    'EU', 'AF', 'AF', 'EU', 'EU', 'AF', 'OC', 'EU', 'AF', 'AS', 'AS', 'AS',
    'OC', 'NA', 'AF', 'NA', 'EU', 'AF', 'AS', 'AF', 'NA', 'AS', 'AF', 'AF',
    'OC', 'AF', 'OC', 'AF', 'NA', 'EU', 'EU', 'AS', 'OC', 'OC', 'OC', 'AS',
    'NA', 'SA', 'OC', 'OC', 'AS', 'AS', 'EU', 'NA', 'OC', 'NA', 'AS', 'EU',
    'OC', 'SA', 'AS', 'AF', 'EU', 'EU', 'AF', 'AS', 'OC', 'AF', 'AF', 'EU',
    'AS', 'AF', 'EU', 'EU', 'EU', 'AF', 'EU', 'AF', 'AF', 'SA', 'AF', 'NA',
    'AS', 'AF', 'NA', 'AF', 'AN', 'AF', 'AS', 'AS', 'OC', 'AS', 'AF', 'OC',
    'AS', 'EU', 'NA', 'OC', 'AS', 'AF', 'EU', 'AF', 'OC', 'NA', 'SA', 'AS',
    'EU', 'NA', 'SA', 'NA', 'NA', 'AS', 'OC', 'OC', 'OC', 'AS', 'AF', 'EU',
    'AF', 'AF', 'EU', 'AF', '--', '--', '--', 'EU', 'EU', 'EU', 'EU', 'NA',
    'NA', 'NA', 'AF'
)

# storage / caching flags
STANDARD = 0
MEMORY_CACHE = 1
MMAP_CACHE = 8

# Database structure constants
COUNTRY_BEGIN = 16776960
STATE_BEGIN_REV0 = 16700000
STATE_BEGIN_REV1 = 16000000

STRUCTURE_INFO_MAX_SIZE = 20
DATABASE_INFO_MAX_SIZE = 100

# Database editions
COUNTRY_EDITION = 1
COUNTRY_EDITION_V6 = 12
REGION_EDITION_REV0 = 7
REGION_EDITION_REV1 = 3
CITY_EDITION_REV0 = 6
CITY_EDITION_REV1 = 2
CITY_EDITION_REV1_V6 = 30
ORG_EDITION = 5
ISP_EDITION = 4
ASNUM_EDITION = 9
ASNUM_EDITION_V6 = 21
# Not yet supported databases
PROXY_EDITION = 8
NETSPEED_EDITION = 11

# Collection of databases
IPV6_EDITIONS = (COUNTRY_EDITION_V6, ASNUM_EDITION_V6, CITY_EDITION_REV1_V6)
CITY_EDITIONS = (CITY_EDITION_REV0, CITY_EDITION_REV1, CITY_EDITION_REV1_V6)
REGION_EDITIONS = (REGION_EDITION_REV0, REGION_EDITION_REV1)
REGION_CITY_EDITIONS = REGION_EDITIONS + CITY_EDITIONS

SEGMENT_RECORD_LENGTH = 3
STANDARD_RECORD_LENGTH = 3
ORG_RECORD_LENGTH = 4
MAX_RECORD_LENGTH = 4
MAX_ORG_RECORD_LENGTH = 300
FULL_RECORD_LENGTH = 50

US_OFFSET = 1
CANADA_OFFSET = 677
WORLD_OFFSET = 1353
FIPS_RANGE = 360

ENCODING = 'iso-8859-1'
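In the legacy GeoIP country format, the binary-tree search returns `COUNTRY_BEGIN + country_id` rather than the id itself, so readers subtract the base to recover an index into the parallel country tuples. The sketch below illustrates that convention; it is an assumption drawn from the constants above, not code present in this diff, and the edition-collection check mirrors how `CITY_EDITIONS` groups the city database types.

```python
# Constants copied from const.py above.
COUNTRY_BEGIN = 16776960
CITY_EDITION_REV0 = 6
CITY_EDITION_REV1 = 2
CITY_EDITION_REV1_V6 = 30
CITY_EDITIONS = (CITY_EDITION_REV0, CITY_EDITION_REV1, CITY_EDITION_REV1_V6)


def pointer_to_country_id(pointer):
    """Recover the country index from a tree-search result (assumed
    convention: country databases return COUNTRY_BEGIN + country_id)."""
    return pointer - COUNTRY_BEGIN


def is_city_db(db_type):
    """True when the edition id belongs to one of the city databases."""
    return db_type in CITY_EDITIONS


print(pointer_to_country_id(COUNTRY_BEGIN + 42))  # 42
print(is_city_db(CITY_EDITION_REV1))              # True
```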

File diff suppressed because it is too large
@@ -1,42 +1,36 @@
# -*- coding: utf-8 -*-
"""
Utility functions. Part of the pygeoip package.

@author: Jennifer Ennis <zaylea@gmail.com>

@license: Copyright(C) 2004 MaxMind LLC

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Lesser General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU Lesser General Public License
along with this program. If not, see <http://www.gnu.org/licenses/lgpl.txt>.
"""

import socket
import binascii


def ip2long(ip):
    """
    Wrapper function for IPv4 and IPv6 converters

    @param ip: IPv4 or IPv6 address
    @type ip: str
    """
    try:
        return int(binascii.hexlify(socket.inet_aton(ip)), 16)
    except socket.error:
        return int(binascii.hexlify(socket.inet_pton(socket.AF_INET6, ip)), 16)
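The updated `ip2long` first tries the IPv4 parser and falls back to IPv6 on a `socket.error`, returning the address as an integer in both cases. A quick self-contained check of that behavior (the function body is reproduced here for illustration):

```python
import socket
import binascii


def ip2long(ip):
    """Convert an IPv4 or IPv6 address string to an integer."""
    try:
        # Dotted-quad IPv4: inet_aton yields 4 network-order bytes.
        return int(binascii.hexlify(socket.inet_aton(ip)), 16)
    except socket.error:
        # Fall back to IPv6: inet_pton yields 16 network-order bytes.
        return int(binascii.hexlify(socket.inet_pton(socket.AF_INET6, ip)), 16)


print(ip2long('127.0.0.1'))  # 2130706433 (0x7F000001)
print(ip2long('::1'))        # 1
```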