pymks.mks_structure_analysis.
MKSStructureAnalysis
(basis, correlations=None, dimension_reducer=None, n_components=None, periodic_axes=None, store_correlations=False, n_jobs=1, mean_center=True)¶MKSStructureAnalysis computes the 2-point statistics for a set of microstructures and performs dimensionality reduction. It can be used to evaluate the selection of spatial correlations and to look at clustering of the 2-point statistics.
n_components
¶Number of components used by dimension_reducer.
dimension_reducer
¶Instance of a dimensionality reduction class.
correlations
¶Spatial correlations to be computed.
basis
¶Instance of a basis class.
reduced_fit_data
¶Low dimensionality representation of spatial correlations used to fit the components.
reduced_transformed_data
¶Low dimensionality representation of the spatial correlations produced by transform.
periodic_axes
¶Axes that are periodic. (0, 2) would indicate that axes x and z are periodic in a 3D microstructure (a construction sketch follows this list).
transformed_correlations
¶Spatial correlations transformed into the low dimensional space.
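As a rough construction sketch (the correlation pairs, axes, and component count below are illustrative placeholders, not values taken from the library's documentation), the correlations and periodic_axes attributes above can be set through the constructor, for example for a periodic 2D two-phase microstructure:
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> analyzer = MKSStructureAnalysis(basis=prim_basis,
...                                 correlations=[(0, 0), (0, 1)],
...                                 periodic_axes=(0, 1),
...                                 n_components=2)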
Below is an example of using MKSStructureAnalysis with FastICA.
>>> from pymks.datasets import make_microstructure
>>> from pymks.bases import PrimitiveBasis
>>> from sklearn.decomposition import FastICA
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> reducer = FastICA(n_components=3)
>>> analyzer = MKSStructureAnalysis(basis=prim_basis, mean_center=False,
...                                 dimension_reducer=reducer)
>>> X = make_microstructure(n_samples=4, size=(13, 13), grain_size=(3, 3))
>>> print(analyzer.fit_transform(X))
[[ 0.5 0.5 0.5]
[ 0.5 0.5 0.5]
[0.5 0.5 0.5]
[0.5 0.5 0.5]]
Create an instance of a MKSStructureAnalysis.
fit
(X, reducer_labels=None, confidence_index=None)¶Fits data by using the 2-point statistics for X to fit the components used in dimensionality reduction.
Example
>>> import numpy as np
>>> from pymks.datasets import make_delta_microstructures
>>> from pymks import PrimitiveBasis
>>> n_states = 2
>>> analyzer = MKSStructureAnalysis(basis=PrimitiveBasis(n_states),
... n_components=1)
>>> np.random.seed(5)
>>> size = (2, 3, 3)
>>> X = np.random.randint(2, size=size)
>>> analyzer.fit(X)
>>> print(analyzer.dimension_reducer.components_.reshape(size)[0])
[[ 0.02886463 0.02886463 0.02886463]
[ 0.02886463 0.43874233 0.49647159]
[ 0.02886463 0.02886463 0.17896069]]
fit_transform
(X, confidence_index=None)¶Fits data by using the 2-point statistics for X to fit the components used in dimensionality reduction and returns the reduction of the 2-point statistics for X.
Returns:  Reduction of the 2-point statistics of X used to fit the components. 
Example
>>> import numpy as np
>>> from pymks.datasets import make_delta_microstructures
>>> from pymks import PrimitiveBasis
>>> n_states = 2
>>> analyzer = MKSStructureAnalysis(basis=PrimitiveBasis(n_states),
... n_components=1)
>>> np.random.seed(5)
>>> size = (2, 3, 3)
>>> X = np.random.randint(2, size=size)
>>>
>>> print(analyzer.fit_transform(X))
[[ 0.26731852]
 [-0.26731852]]
transform
(X, confidence_index=None)¶Computes the 2-point statistics for X and applies dimensionality reduction.
Returns:  Reduction of the 2-point statistics of X. 
Example
>>> import numpy as np
>>> from pymks.datasets import make_delta_microstructures
>>> from pymks import PrimitiveBasis
>>> n_states = 2
>>> analyzer = MKSStructureAnalysis(basis=PrimitiveBasis(n_states),
... n_components=1)
>>> np.random.seed(5)
>>> size = (2, 3, 3)
>>> X = np.random.randint(2, size=size)
>>> print(analyzer.fit_transform(X))
[[ 0.26731852]
 [-0.26731852]]
>>> print(analyzer.transform(X))
[[ 0.26731852]
 [-0.26731852]]
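The same analyzer can also reduce microstructures that were not part of the fit; a minimal sketch continuing from the example above (the new samples are random placeholders):
>>> X_new = np.random.randint(2, size=(3, 3, 3))
>>> X_new_reduced = analyzer.transform(X_new)  # one row per sample, one column per component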
pymks.mks_homogenization_model.
MKSHomogenizationModel
(basis=None, dimension_reducer=None, n_components=None, property_linker=None, degree=1, periodic_axes=None, correlations=None, compute_correlations=True, n_jobs=1, store_correlations=False, mean_center=True)¶The MKSHomogenizationModel takes in microstructures and their associated macroscopic property, and creates a low dimensional structure-property linkage. The MKSHomogenizationModel is designed to integrate with dimensionality reduction techniques and predictive models.
degree
¶Degree of the polynomial used by property_linker.
n_components
¶Number of components used by dimension_reducer.
dimension_reducer
¶Instance of a dimensionality reduction class.
property_linker
¶Instance of a class that maps the materials property to the microstructures.
correlations
¶Spatial correlations to be computed.
basis
¶Instance of a basis class.
reduced_fit_data
¶Low dimensionality representation of spatial correlations used to fit the model.
reduced_predict_data
¶Low dimensionality representation of spatial correlations predicted by the model.
periodic_axes
¶Axes that are periodic. (0, 2) would indicate that axes x and z are periodic in a 3D microstructure (a construction sketch follows this list).
coef_
¶Array of values that are the coefficients.
intercept_
¶Value of the intercept.
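As a rough sketch of how the degree and periodic_axes arguments from the signature above can be combined at construction time (the particular reducer, linker, degree, and axes here are placeholder choices, not taken from the library's documentation):
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> quad_model = MKSHomogenizationModel(basis=PrimitiveBasis(n_states=2),
...                                     dimension_reducer=PCA(n_components=3),
...                                     property_linker=LinearRegression(),
...                                     degree=2,
...                                     periodic_axes=(0, 1))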
Below is an example of using MKSHomogenizationModel to predict (or classify) the type of microstructure using PCA and Logistic Regression.
>>> import numpy as np
>>> n_states = 3
>>> domain = [-1, 1]
>>> from pymks.bases import LegendreBasis
>>> leg_basis = LegendreBasis(n_states=n_states, domain=domain)
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LogisticRegression
>>> reducer = PCA(n_components=3)
>>> linker = LogisticRegression()
>>> model = MKSHomogenizationModel(
... basis=leg_basis, dimension_reducer=reducer, property_linker=linker)
>>> from pymks.datasets import make_cahn_hilliard
>>> X0, X1 = make_cahn_hilliard(n_samples=50)
>>> y0 = np.zeros(X0.shape[0])
>>> y1 = np.ones(X1.shape[0])
>>> X = np.concatenate((X0, X1))
>>> y = np.concatenate((y0, y1))
>>> model.fit(X, y)
>>> X0_test, X1_test = make_cahn_hilliard(n_samples=3)
>>> y0_test = model.predict(X0_test)
>>> y1_test = model.predict(X1_test)
>>> assert np.allclose(y0_test, [0, 0, 0])
>>> assert np.allclose(y1_test, [1, 1, 1])
Create an instance of a MKSHomogenizationModel.
fit
(X, y, reduce_labels=None, confidence_index=None, size=None)¶Fits data by calculating the 2-point statistics from X, performing dimensionality reduction using dimension_reducer, and fitting the reduced data with property_linker.
Example
Let’s first start with using the microstructure and effective properties.
>>> import numpy as np
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> from pymks.stats import correlate
>>> reducer = PCA(n_components=2)
>>> linker = LinearRegression()
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> correlations = [(0, 0), (1, 1), (0, 1)]
>>> model = MKSHomogenizationModel(prim_basis,
... dimension_reducer=reducer,
... property_linker=linker,
... correlations=correlations)
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(3, 15))
>>> y = np.array([1, 2, 3])
>>> model.fit(X, y)
>>> X_stats = correlate(X, prim_basis)
>>> X_reshaped = X_stats.reshape((X_stats.shape[0], -1))
>>> X_pca = reducer.fit_transform(X_reshaped - np.mean(X_reshaped,
... axis=1)[:, None])
>>> assert np.allclose(model.reduced_fit_data, X_pca)
Now let’s use the same method with spatial correlations instead of microstructures.
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> from pymks.stats import correlate
>>> reducer = PCA(n_components=2)
>>> linker = LinearRegression()
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> correlations = [(0, 0), (1, 1), (0, 1)]
>>> model = MKSHomogenizationModel(dimension_reducer=reducer,
... property_linker=linker,
... compute_correlations=False)
>>> np.random.seed(99)
>>> X = np.random.randint(2, size=(3, 15))
>>> y = np.array([1, 2, 3])
>>> X_stats = correlate(X, prim_basis, correlations=correlations)
>>> model.fit(X_stats, y)
>>> X_reshaped = X_stats.reshape((X_stats.shape[0], X_stats[0].size))
>>> X_pca = reducer.fit_transform(X_reshaped - np.mean(X_reshaped,
... axis=1)[:, None])
>>> assert np.allclose(model.reduced_fit_data, X_pca)
predict
(X, confidence_index=None)¶Predicts macroscopic property for the microstructures X.
Returns:  The predicted macroscopic property for X. 
Example
>>> import numpy as np
>>> from sklearn.manifold import LocallyLinearEmbedding
>>> from sklearn.linear_model import BayesianRidge
>>> from pymks.bases import PrimitiveBasis
>>> np.random.seed(1)
>>> X = np.random.randint(2, size=(50, 100))
>>> y = np.random.random(50)
>>> reducer = LocallyLinearEmbedding()
>>> linker = BayesianRidge()
>>> prim_basis = PrimitiveBasis(2, domain=[0, 1])
>>> model = MKSHomogenizationModel(prim_basis, n_components=2,
... dimension_reducer=reducer,
... property_linker=linker)
>>> model.fit(X, y)
>>> X_test = np.random.randint(2, size=(1, 100))
Predict with microstructures
>>> y_pred = model.predict(X_test)
Predict with spatial correlations
>>> from pymks.stats import correlate
>>> model.compute_correlations = False
>>> X_corr = correlate(X, prim_basis, correlations=[(0, 0)])
>>> model.fit(X_corr, y)
>>> X_corr_test = correlate(X_test, prim_basis,
... correlations=[(0, 0)])
>>> y_pred_stats = model.predict(X_corr_test)
>>> assert np.allclose(y_pred_stats, y_pred, atol=1e-3)
score
(X, y, confidence_index=None)¶The score function for the MKSHomogenizationModel. It formats the data and uses the score method from the property_linker.
Returns:  Score for MKSHomogenizationModel from the selected property_linker. 
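A minimal sketch of calling score after fitting (the microstructures and property values below are random placeholders; the returned value is whatever the property_linker's own score method reports, e.g. R^2 for LinearRegression):
>>> import numpy as np
>>> from sklearn.decomposition import PCA
>>> from sklearn.linear_model import LinearRegression
>>> from pymks.bases import PrimitiveBasis
>>> np.random.seed(7)
>>> X = np.random.randint(2, size=(10, 21))
>>> y = np.random.random(10)
>>> prim_basis = PrimitiveBasis(n_states=2, domain=[0, 1])
>>> model = MKSHomogenizationModel(basis=prim_basis,
...                                dimension_reducer=PCA(n_components=2),
...                                property_linker=LinearRegression())
>>> model.fit(X, y)
>>> r_squared = model.score(X, y)  # delegates to LinearRegression.score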
pymks.mks_localization_model.
MKSLocalizationModel
(basis, n_states=None, n_jobs=1, lstsq_rcond=None)¶The MKSLocalizationModel fits data using the Materials Knowledge System in Fourier Space. The following demonstrates the viability of the MKSLocalizationModel with a simple 1D filter.
basis
¶Basis function used to discretize the microstructure.
n_states
¶Integer value for the number of local states; if a basis is specified, n_states indicates the order of the polynomial.
coef_
¶Array of values that are the influence coefficients.
>>> import numpy as np
>>> n_states = 2
>>> n_spaces = 81
>>> n_samples = 400
Define a filter function.
>>> def filter(x):
...     return np.where(x < 10,
...                     np.exp(-abs(x)) * np.cos(x * np.pi),
...                     np.exp(-abs(x - 20)) * np.cos((x - 20) * np.pi))
Use the filter function to construct some coefficients.
>>> coef_ = np.linspace(1, 0, n_states)[None,:] * filter(np.linspace(0, 20,
... n_spaces))[:,None]
>>> Fcoef_ = np.fft.fft(coef_, axis=0)
Make some test samples.
>>> np.random.seed(2)
>>> X = np.random.random((n_samples, n_spaces))
Construct a response with the Fcoef_.
>>> H = np.linspace(0, 1, n_states)
>>> X_ = np.maximum(1 - abs(X[:, :, None] - H) / (H[1] - H[0]), 0)
>>> FX = np.fft.fft(X_, axis=1)
>>> Fy = np.sum(Fcoef_[None] * FX, axis=1)
>>> y = np.fft.ifft(Fy, axis=1).real
Use the MKSLocalizationModel to reconstruct the coefficients
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(n_states, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
Check the result
>>> assert np.allclose(np.fft.fftshift(coef_, axes=(0,)), model.coef_)
Instantiate a MKSLocalizationModel.
coef_
¶Returns the coefficients in real space with origin shifted to the center.
fit
(X, y, size=None)¶Fits the data by calculating a set of influence coefficients.
Example
>>> import numpy as np
>>> X = np.linspace(0, 1, 4).reshape((1, 2, 2))
>>> y = X.swapaxes(1, 2)
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(2, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
>>> assert np.allclose(model._filter._Fkernel, [[[ 0.5, 0.5],
... [ 2, 0]],
... [[0.5, 0 ],
... [ 1, 0 ]]])
predict
(X)¶Predicts a new response from the microstructure function X with calibrated influence coefficients.
Parameters:  X (ND array) – The microstructure, an (n_samples, n_x, ...) shaped array where n_samples is the number of samples and n_x is the spatial discretization. 

Returns:  The predicted response field, the same shape as X. 
Example
>>> import numpy as np
>>> X = np.linspace(0, 1, 4).reshape((1, 2, 2))
>>> y = X.swapaxes(1, 2)
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(2, [0, 1])
>>> model = MKSLocalizationModel(basis=prim_basis)
>>> model.fit(X, y)
>>> assert np.allclose(y, model.predict(X))
The fit method must be called to calibrate the coefficients before the predict method can be used.
>>> MKSModel = MKSLocalizationModel(prim_basis)
>>> MKSModel.predict(X)
Traceback (most recent call last):
...
AttributeError: fit() method must be run before predict().
resize_coeff
(size)¶Scale the size of the coefficients and pad with zeros.
Parameters:  size (tuple) – The new size of the influence coefficients. 

Returns:  The resized influence coefficients to size. 
Example
Let’s first instantiate a model and fabricate some coefficients.
>>> import numpy as np
>>> from pymks.bases import PrimitiveBasis
>>> prim_basis = PrimitiveBasis(n_states=1)
>>> prim_basis._axes = np.array([1, 2])
>>> prim_basis._axes_shape = (5, 4)
>>> model = MKSLocalizationModel(prim_basis)
>>> coef_ = np.arange(20).reshape((1, 5, 4, 1))
>>> coef_ = np.concatenate((coef_, np.ones_like(coef_)), axis=-1)
>>> coef_ = np.fft.ifftshift(coef_, axes=(1, 2))
>>> model._filter = Filter(np.fft.rfftn(coef_, axes=(1, 2)),
... prim_basis)
The coefficients can be reshaped by passing the new shape that coefficients should have.
>>> model.resize_coeff((10, 7))
>>> assert np.allclose(model.coef_[... ,0],
... [[0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 1, 2, 3, 0],
... [0, 0, 4, 5, 6, 7, 0],
... [0, 0, 8, 9,10,11, 0],
... [0, 0,12,13,14,15, 0],
... [0, 0,16,17,18,19, 0],
... [0, 0, 0, 0, 0, 0, 0],
... [0, 0, 0, 0, 0, 0, 0]])