models.model

Module Contents

Classes

_FeatureDescription(self, fd_spec)
NeuralNetworkShaper(self, model, useInputAndOutputShapes=True) This class computes the intermediate tensor shapes for a neural network model.
MLModel(self, model) This class defines the minimal interface to a CoreML object in Python.

Functions

_get_proxy_from_spec(filename)
class _FeatureDescription(fd_spec)
__init__(fd_spec)
__repr__()
__len__()
__getitem__(key)
__contains__(key)
__setitem__(key, value)
__iter__()
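
_FeatureDescription is a private helper that behaves like a mapping over feature names; the input_description and output_description properties of MLModel return instances of it. A minimal sketch of that dict-like access, assuming a loaded model whose inputs include a 'size' feature (the feature name is a placeholder):

>>> desc = model.input_description          # a _FeatureDescription instance
>>> 'size' in desc                          # __contains__ checks for a feature name
>>> desc['size'] = 'Size (in square feet)'  # __setitem__ updates that feature's description
>>> for name in desc:                       # __iter__ yields the feature names
...     print(name, desc[name])             # __getitem__ returns the description string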
_get_proxy_from_spec(filename)
class NeuralNetworkShaper(model, useInputAndOutputShapes=True)

This class computes the intermediate tensor shapes for a neural network model.

__init__(model, useInputAndOutputShapes=True)
shape(name)
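
A hedged usage sketch: NeuralNetworkShaper is constructed from a neural network model (a loaded spec, or possibly a path to a .mlmodel), and shape(name) looks up the computed shape for a named tensor. The model path and blob name below are hypothetical:

>>> from coremltools.models import MLModel
>>> from coremltools.models.model import NeuralNetworkShaper
>>> spec = MLModel('MyNetwork.mlmodel').get_spec()
>>> shaper = NeuralNetworkShaper(spec)
>>> shaper.shape('conv1_output')   # shape information for the named intermediate blob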
class MLModel(model)

This class defines the minimal interface to a CoreML object in Python.

At a high level, the protobuf specification consists of:

  • Model description: Encodes names and type information of the inputs and outputs to the model.
  • Model parameters: The set of parameters required to represent a specific instance of the model.
  • Metadata: Information about the origin, license, and author of the model.

With this class, you can inspect a CoreML model, modify metadata, and make predictions for the purposes of testing (on select platforms).

# Load the model
>>> model = MLModel('HousePricer.mlmodel')

# Set the model metadata
>>> model.author = 'Author'
>>> model.license = 'BSD'
>>> model.short_description = 'Predicts the price of a house in the Seattle area.'

# Get the interface to the model
>>> model.input_description
>>> model.output_description

# Set feature descriptions manually
>>> model.input_description['bedroom'] = 'Number of bedrooms'
>>> model.input_description['bathrooms'] = 'Number of bathrooms'
>>> model.input_description['size'] = 'Size (in square feet)'

# Set the output descriptions
>>> model.output_description['price'] = 'Price of the house'

# Make predictions
>>> predictions = model.predict({'bedroom': 1.0, 'bathrooms': 1.0, 'size': 1240})

# Get the spec of the model
>>> model.spec

# Save the model
>>> model.save('HousePricer.mlmodel')

See also: predict

__init__(model)

Construct an MLModel from a .mlmodel file or an in-memory protobuf specification (Model_pb2).

model: str | Model_pb2
If a string is given it should be the location of the .mlmodel to load.
>>> loaded_model = MLModel('my_model_file.mlmodel')
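
Since model may also be a Model_pb2 protobuf object, a model can be reconstructed from an in-memory spec as well (a sketch reusing the model loaded above):

>>> spec = loaded_model.get_spec()
>>> model_from_spec = MLModel(spec)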
short_description()
short_description(short_description)
input_description()
output_description()
user_defined_metadata()
author()
author(author)
license()
license(license)
__repr__()
__str__()
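
The metadata properties above are readable and writable; user_defined_metadata behaves like a dictionary of custom key/value string pairs. A brief hedged sketch (the key and value are placeholders):

>>> model.author = 'Author'
>>> model.user_defined_metadata['source'] = 'example pipeline'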
save(filename)

Save the model in .mlmodel format.

filename : str
Target filename for the model.

See also: coremltools.utils.load_model

>>> model.save('my_model_file.mlmodel')
>>> loaded_model = MLModel('my_model_file.mlmodel')
get_spec()

Get a deep copy of the protobuf specification of the model.

model : Model_pb2
Protobuf specification of the model.
>>> spec = model.get_spec()
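
Because the returned spec is a deep copy, it can be modified without affecting the loaded model and then used to construct a new MLModel (a sketch; versionString is a field of the spec's metadata message):

>>> spec.description.metadata.versionString = '1.0'
>>> updated_model = MLModel(spec)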
predict(data, useCPUOnly=False, **kwargs)

Return predictions for the model. Any kwargs are passed through to the underlying model as a dictionary.

data : dict[str, value]
Dictionary of input data, where the keys are the names of the model's input features.
useCPUOnly : bool
Set to True to restrict computation to the CPU only. Defaults to False.
out : dict[str, value]
Predictions as a dictionary where each key is the output feature name.
>>> data = {'bedroom': 1.0, 'bath': 1.0, 'size': 1240}
>>> predictions = model.predict(data)
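>>> predictions = model.predict(data, useCPUOnly=True)  # force CPU-only evaluation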
visualize_spec(port=None, input_shape_dict=None)

Visualize the model.

port : int
Localhost port on which to host the visualization server, if a specific port is desired.
input_shape_dict : dict
Full input shapes for the model's inputs. By default, shapes are computed assuming the batch and sequence dimensions are 1, i.e. (1, 1, C, H, W); if either is not 1, provide the full input shape for each input.

Returns None.

>>> model = coremltools.models.MLModel('HousePricer.mlmodel')
>>> model.visualize_spec()
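
A hedged sketch passing both optional arguments; the port, input name, and shape are placeholders, and the full (Batch, Sequence, C, H, W) shape is given because the batch dimension is not 1:

>>> model.visualize_spec(port=8000, input_shape_dict={'image': (2, 1, 3, 224, 224)})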