Legacy

Hazus Legacy methods and classes. This module interacts with and builds upon the Hazus Legacy desktop software.

hazus.legacy

Methods

getStudyRegions

legacy.getStudyRegions()

Gets all study region names imported into your local Hazus install

Returns:

studyRegions: list – study region names

Example:

# get list of study regions
studyRegions = hazus.legacy.getStudyRegions()

createExportObj

legacy.createExportObj()

Creates a dictionary to be used in the hazus.legacy.export method and hazus.legacy.Exporting class

Returns:

exportObj: dict – the opt_* fields are booleans that toggle export options; the remaining fields are strings.
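
The returned template presumably looks something like the dictionary below. The key names come from the fields documented here and in the examples; the default values shown are assumptions, not guaranteed defaults.

# assumed shape of the template returned by createExportObj()
exportObj = {
    'opt_csv': True,            # assumed default
    'opt_shp': True,            # assumed default
    'opt_report': True,         # assumed default
    'opt_json': True,           # assumed default
    'study_region': '',         # required – Hazus study region (HPR) name
    'output_directory': '',     # required – directory for the output files
    'title': '',                # optional – report title
    'meta': ''                  # optional – report subtitle/meta data
}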

Example:

# create a template export object
exportObject = hazus.legacy.createExportObj()

# modify required keys
exportObject['study_region'] = 'My_Eq_StudyRegionName' # set the study region to export
exportObject['output_directory'] = r'C:\Users\user\disasters\event' # set the directory for the output files

# modify what you will export (True will export and False will not export)
exportObject['opt_csv'] = True # export data as CSVs
exportObject['opt_shp'] = True # export data as Shapefiles
exportObject['opt_report'] = True # export high level report
exportObject['opt_json'] = True # export data as json

# modify optional report keys
exportObject['title'] = 'Kiholo Earthquake M6.4' # report title
exportObject['meta'] = 'Shakemap version 5 US' # subtitle/meta data for the report

export

legacy.export()

Exports data from Hazus Legacy. Can export CSVs, Shapefiles, PDF reports, and JSON. Use hazus.legacy.createExportObj() to create a base object for the keyword arguments.

Keyword arguments:
exportObj: dictionary – {

    opt_csv: boolean – export CSVs
    opt_shp: boolean – export Shapefile(s)
    opt_report: boolean – export report
    opt_json: boolean – export JSON
    study_region: str – name of the Hazus study region (HPR name)
    ?title: str – title on the report
    ?meta: str – sub-title on the report (ex: Shakemap v5)
    output_directory: str – directory location for the outputs

}

Example:

# set up export object
studyRegions = hazus.legacy.getStudyRegions() # get all study regions
exportObject = hazus.legacy.createExportObj() # create a template export object
exportObject['study_region'] = studyRegions[0] # set the study region to export
exportObject['output_directory'] = r'C:\Users\user\disasters\event' # set the directory for the output files

# perform export
hazus.legacy.export(exportObject)

Classes

Exporting

class hazus.legacy.Exporting(exportObj)[source]

Export class for Hazus Legacy. Can export CSVs, Shapefiles, PDF reports, and JSON. Use hazus.legacy.createExportObj() to create a base object for the keyword arguments. Exporting method logic follows: setup, getData, toCSV, toShapefile, toReport.

Keyword arguments:
exportObj: dictionary – {

    study_region: str – name of the Hazus study region (HPR name)
    output_directory: str – directory location for the outputs
    ?title: str – title on the report
    ?meta: str – sub-title on the report (ex: Shakemap v5)

}

Exporting.setup()[source]

Establishes the connection to SQL Server

Exporting.getData()[source]

Queries and parses the data from SQL Server, preparing it for exporting

Exporting.toCSV()[source]

Exports the study region data to CSVs

Exporting.toShapefile()[source]

Exports the study region data to Shapefile(s)

Exporting.toReport()[source]

Exports the study region data to a one-page PDF report

Example:

# set up export object
studyRegions = hazus.legacy.getStudyRegions() # get all study regions
exportObject = hazus.legacy.createExportObj() # create a template export object
exportObject['study_region'] = studyRegions[0] # set the study region to export
exportObject['output_directory'] = r'C:\Users\user\disasters\event' # set the directory for the output files

# initialize export class
export = hazus.legacy.Exporting(exportObject)

# run export class methods
export.setup() # make database connections
export.getData() # get and parse the data from SQL Server
export.toCSV() # export data to CSV
export.toShapefile() # export data to Shapefile
export.toReport() # export data to a PDF
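
Because the export steps are exposed as separate methods, you can presumably run only the outputs you need once setup() and getData() have been called; the sequencing below is an assumption based on the method descriptions, not documented behavior.

# minimal sketch: export only CSVs
export = hazus.legacy.Exporting(exportObject)
export.setup()    # make database connections
export.getData()  # get and parse the data from SQL Server
export.toCSV()    # only export CSVs; skip toShapefile() and toReport()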

HazusDB

class hazus.legacy.HazusDB[source]

Creates a connection to the Hazus SQL Server database with methods to access databases, tables, and study regions

HazusDB.createConnection()[source]

Creates a connection object to the local Hazus SQL Server database

Returns:

conn: pyodbc connection
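
The returned object is a standard pyodbc connection, so it can be used with a cursor or handed to other libraries that accept DBAPI connections. A minimal sketch; the database and table names are placeholders:

# run an ad hoc query over the connection returned by createConnection()
db = hazus.legacy.HazusDB()
conn = db.createConnection()
cursor = conn.cursor()
cursor.execute('select top 10 * from DATABASE.dbo.TABLENAME')  # placeholder names
rows = cursor.fetchall()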

HazusDB.getDatabases()[source]

Creates a dataframe of all databases in your Hazus installation

Returns:

df: pandas dataframe

HazusDB.getTables(databaseName)[source]

Creates a dataframe of all tables in a database

Keyword Arguments:

databaseName: str – the name of the Hazus SQL Server database

Returns:

df: pandas dataframe

HazusDB.getStudyRegions()[source]

Creates a dataframe of all study regions in the local Hazus SQL Server database

Returns:

studyRegions: pandas dataframe

HazusDB.query(sql)[source]

Performs a SQL query on the Hazus SQL Server database

Keyword Arguments:

sql: str – a T-SQL query

Returns:

df: pandas dataframe

HazusDB.getHazardBoundary(databaseName)[source]

Fetches the hazard boundary from a Hazus SQL Server database

Keyword Arguments:

databaseName: str – the name of the database

Returns:

df: pandas dataframe – geometry in WKT

Example:

# set up Hazus database object
db = hazus.legacy.HazusDB() # initializes the HazusDB class

# create database connection object
conn = db.createConnection() # returns a connection to the Hazus SQL Server database

# predefined database queries
databases = db.getDatabases() # returns a dataframe of all databases in your Hazus installation
databaseName = databases.iloc[8, 0] # gets a database name from the dataframe of database names
tables = db.getTables(databaseName) # returns all tables in a specified database
studyRegions = db.getStudyRegions() # returns a dataframe of all study regions

# custom query
sql = 'select * from DATABASE.dbo.TABLENAME'
selection = db.query(sql)

# get a dataframe containing the hazard boundary geometry as WKT (converted to a geodataframe in the sketch below)
hazardGDF = db.getHazardBoundary(databaseName)
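
getHazardBoundary returns the geometry as WKT strings in a pandas dataframe. If an actual geodataframe is needed, the WKT can presumably be parsed with shapely and wrapped with geopandas; the geometry column name and CRS below are assumptions, so check the returned dataframe before using this sketch.

# minimal sketch: convert the WKT dataframe to a geodataframe
import geopandas as gpd
from shapely import wkt

hazardGDF['geometry'] = hazardGDF['Shape'].apply(wkt.loads)  # 'Shape' column name is an assumption
hazardGDF = gpd.GeoDataFrame(hazardGDF, geometry='geometry', crs='EPSG:4326')  # CRS assumed to be WGS84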