Module Contents


NeuralNetworkShaper(self, model, useInputAndOutputShapes=True)
    This class computes the intermediate tensor shapes for a neural network model.
MLModel(self, model)
    This class defines the minimal interface to a CoreML object in Python.


class _FeatureDescription(fd_spec)
__setitem__(key, value)
class NeuralNetworkShaper(model, useInputAndOutputShapes=True)

This class computes the intermediate tensor shapes for a neural network model.

__init__(model, useInputAndOutputShapes=True)
class MLModel(model)

This class defines the minimal interface to a CoreML object in Python.

At a high level, the protobuf specification consists of:

  • Model description: Encodes names and type information of the inputs and outputs to the model.
  • Model parameters: The set of parameters required to represent a specific instance of the model.
  • Metadata: Information about the origin, license, and author of the model.

With this class, you can inspect a CoreML model, modify metadata, and make predictions for the purposes of testing (on select platforms).

# Load the model
>>> model = MLModel('HousePricer.mlmodel')

# Set the model metadata
>>> model.author = 'Author'
>>> model.license = 'BSD'
>>> model.short_description = 'Predicts the price of a house in the Seattle area.'

# Get the interface to the model
>>> model.input_description
>>> model.output_description

# Set feature descriptions manually
>>> model.input_description['bedroom'] = 'Number of bedrooms'
>>> model.input_description['bathrooms'] = 'Number of bathrooms'
>>> model.input_description['size'] = 'Size (in square feet)'

# Set the output description
>>> model.output_description['price'] = 'Price of the house'

# Make predictions
>>> predictions = model.predict({'bedroom': 1.0, 'bathrooms': 1.0, 'size': 1240})

# Get the spec of the model
>>> spec = model.get_spec()

# Save the model
>>> model.save('HousePricer.mlmodel')


__init__(model)

Construct an MLModel from a .mlmodel file.

model : str | Model_pb2
If a string is given, it should be the path of the .mlmodel file to load.
>>> loaded_model = MLModel('my_model_file.mlmodel')

save(location)

Save the model to a .mlmodel file.

location : str
Target filename for the model.

>>> model.save('my_model_file.mlmodel')
>>> loaded_model = MLModel('my_model_file.mlmodel')

get_spec()

Get a deep copy of the protobuf specification of the model.

model : Model_pb2
Protobuf specification of the model.
>>> spec = model.get_spec()
predict(data, useCPUOnly=False, **kwargs)

Return predictions for the model. Any keyword arguments are passed into the model as a dictionary.

data : dict[str, value]
Dictionary of the data to make predictions from, where the keys are the names of the input features.
useCPUOnly : bool
Set to True to restrict computation to the CPU only. Defaults to False.
out : dict[str, value]
Predictions, as a dictionary where each key is the output feature name.
>>> data = {'bedroom': 1.0, 'bath': 1.0, 'size': 1240}
>>> predictions = model.predict(data)
visualize_spec(port=None, input_shape_dict=None)

Visualize the model.

port : int
Port on localhost on which to host the visualization server.
input_shape_dict : dict
Dictionary of full input shapes. Shapes are calculated assuming the batch and sequence dimensions are 1, i.e. (1, 1, C, H, W); if either is not 1, provide the full input shape here.


>>> model = coremltools.models.MLModel('HousePricer.mlmodel')
>>> model.visualize_spec()