
What's New in Maple 2018

Connectivity




Maple provides numerous connectivity options with other software tools, from data import and export using a wide variety of formats, to code generation, external calling, internet connectivity, and much more. Maple 2018 expands and enhances Maple’s connectivity toolset in several ways. 


Python

Maple 2018 is now packaged with a Python 3.6 kernel. 

The Python kernel is linked to Maple. This means you can execute Python code and return the results to Maple. 

Everyone who installs Maple will also have access to a Python interpreter with a selection of useful Python libraries.  In Maple, with(Python); loads a small set of tools to help execute commands, inspect variables, and interact with Python. 

with(Python);

[EvalFunction, EvalMember, EvalString, GetVariable, ImportModule, None, SetVariable, Start, Stop]
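
For example, SetVariable and EvalString give a quick round trip between the two environments (a minimal sketch; we assume SetVariable takes the Python variable name as a string, as GetVariable does in the examples below): 

SetVariable("n", 42);    # assign 42 to the Python variable n

EvalString("n + 1");     # evaluate a Python expression that uses it

43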

Arbitrary strings can be parsed and evaluated: 

EvalString("1+1");

2

EvalString("2**3");

8

EvalString("7**15");

4747561509943

EvalString("max([4, 7, 2])");

7

Here's a slightly longer string to parse and evaluate after importing the "statistics" package: 

ImportModule("statistics");

EvalString("statistics.median_grouped([1, 3, 3, 5, 7], interval=2)");

3.50000000000000

Evaluating as a string is easy and flexible, but you can go even further and make Python functions look like native Maple functions: 

mg := EvalString("statistics.median_grouped");

mg := <function median_grouped at 0x...>

mg([1, 3, 3, 5, 7], interval = 2);

3.50000000000000

ImportModule("math");

pysin := EvalString("math.sin");

pysin := <built-in function sin>

pysin(1.0);

0.841470984807897

plot(('pysin')(x), x = -Pi .. Pi);

[Plot: pysin(x) for x from -Pi to Pi]

Module member notation (:-) can be used to access methods of a Python object: 

pymath := EvalString("math");

pymath := <module 'math' (built-in)>

pymath:-cos(1.0);

0.540302305868140

pymath:-acos(1.0);

0.

pymath:-sqrt(2);

1.41421356237310

Here is a more involved example that uses a Python package for HTML parsing and manipulation. 

ImportModule("from bs4 import BeautifulSoup");

htmldoc := "<html><body><h1>Great Novels</h1><a>Don Quixote</a><a>War and Peace</a><a>Moby Dick</a></body></html>":

soup := EvalFunction(BeautifulSoup, htmldoc, "html.parser");

soup:-h1:-string;

Great Novels

[seq(link:-text, link in soup:-find_all('a'))];

["Don Quixote", "War and Peace", "Moby Dick"]

New procedures can be defined from Maple: 

EvalString("def pysum(a, b):\n    return a + b", output = none);

pysum := GetVariable("pysum");

pysum(1, 1);

2

Options are available to control how results come back to Maple.  Above, we used 'output' = 'none' to prevent any result from coming back.  Next, we will use 'output' = 'python' to avoid converting the result back to a Maple object and instead capture a reference to the Python object, in this case an empty string.  This gives us access to Python string methods on the result. 

emptystring := EvalString("''", output = python);

emptystring:-join(["a", "b", "c"]);

abc

The Python process runs in a separate memory space from Maple and can be stopped and restarted at any time. 

EvalString("a = (1, 2, 3)");

1, 2, 3

Stop();

true

GetVariable("a");

Error, (in Python:-GetVariable) name 'a' is not defined

EvalString("2 + 2");

4
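
As seen above, the next call after Stop launches a fresh Python session automatically. Start can also be called explicitly (a sketch, assuming the no-argument form from the export list and a return value mirroring Stop): 

Start();    # explicitly launch a new Python session

true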

Deep Learning with TensorFlow™

Maple 2018 includes a new package, DeepLearning, which offers an API to a subset of the TensorFlow toolset for machine learning using neural networks. 


Example: Fitting a Curve

Here we perform least-squares regression to fit a Fourier series to a set of sample data given by: 

restart:

Points := Matrix([[0., 2.1], [1., -1.5], [2., -3.1], [3., 6.3], [4., 8.2], [5., 11.5], [6., 12.7], [7., 8.4]]):

The general formula for a Fourier series with 2N+1 terms and coefficients c, a[1], ..., a[N], b[1], ..., b[N] is given by: 

F := (x, a, b, c, N) -> c + add(a[n]*cos(n*x) + b[n]*sin(n*x), n = 1 .. N);

F := (x, a, b, c, N) -> c + add(a[n]*cos(n*x) + b[n]*sin(n*x), n = 1 .. N)

For this example we take N = 4:

N := 4;

4

We first declare a sequence of variables in the deep learning graph corresponding to each a[n] and b[n], along with the constant term c:

with(DeepLearning):

a := Array(1 .. N, i -> Variable([0.5], datatype = float[4])):

b := Array(1 .. N, i -> Variable([0.5], datatype = float[4])):

c := Variable([0.5], datatype = float[4]):

We can now define placeholders to hold the sample x- and y-values against which we want to fit F(x, a, b, c, N):

x := Placeholder(float[4]):

y := Placeholder(float[4]):

We can now define a least-squares distance function which we aim to minimize: 

loss := ReduceSum((F(x, a, b, c, N) - y)^2);

optimizer := Optimizer(GradientDescent(0.01));

train := optimizer:-Minimize(loss);

With the structure of our deep learning graph defined, we can proceed to train it on sample data.  After initializing the session, we run 1000 training cycles using the training data. 

x__training := convert(Points[.., 1], list);

[0., 1., 2., 3., 4., 5., 6., 7.]

y__training := convert(Points[.., 2], list);

[2.1, -1.5, -3.1, 6.3, 8.2, 11.5, 12.7, 8.4]

init := VariablesInitializer():

sess := Session();

sess:-Run(init):

for i to 1000 do
    sess:-Run(train, {x in x__training, y in y__training});
end do:

We can now query the state of the trained model for the present value of the loss function: 

sess:-Run(loss, [x in x__training, y in y__training]);

0.0005359546048566699

As this is a small value, we have a close fit for the data.  We can then obtain the final value of the parameters from the trained model. 

result := sess:-Run([Concatenate(a, 0), Concatenate(b, 0), c], [x in x__training, y in y__training]):

A, B, C := result[1], result[2], result[3]:


Finally, we can visualize the result:

with(plots):

display(dataplot(x__training, y__training), plot(F(t, A, B, C[1], N), t = min(x__training) .. max(x__training)));

[Plot: the sample data points together with the fitted Fourier series]

Example: Classification 

Here we use a deep neural network to classify the famous Iris flower data set collected by Edgar Anderson and made famous by Ronald Fisher. This data set includes 150 distinct observations of iris flowers, each of which consists of four empirical observations (sepal length, sepal width, petal length, and petal width) along with a classification into one of three known species (I. setosa, I. versicolor, and I. virginica).

We will repeat here the classical task for which this data set is used: attempting to predict the species based on the four measured quantities.


Training and Test Data
We have divided the data into training and test sets: the former is used to build the model, the latter to test its predictive accuracy. 

training_data := Import(...):    # path to the iris training data (elided)

test_data := Import(...):        # path to the iris test data (elided)

We see that this data set has 150 samples (120 for training and 30 for testing) and that the Species column has three distinct species:

NumRows(test_data), NumRows(training_data);

30, 120

convert(test_data[Species], set) union convert(training_data[Species], set);

{"setosa", "versicolor", "virginica"}

To simplify things we will replace the strings designating the species classification with the numbers 0, 1, 2 (corresponding to setosa, versicolor, and virginica, respectively): 

training_set := map(eval, training_data, ["setosa" = 0, "versicolor" = 1, "virginica" = 2]):

test_set := map(eval, test_data, ["setosa" = 0, "versicolor" = 1, "virginica" = 2]):

Training the Deep Neural Network Model 

With our data prepared, we can now actually define and train the model. 

with(DeepLearning):

Our first step is to define a feature column for each of the four observed quantities, excluding the final column (Species), which we aim to predict: 

cols := ColumnLabels(test_set);

cols := [SepalLength, SepalWidth, PetalLength, PetalWidth, Species]

fc := map(NumericColumn, cols[1 .. 4], shape = [1]):

We can now define a deep neural network classifier with these features.  It has 3 classes because there are 3 species of iris in the dataset.

classifier := DNNClassifier(fc, hidden_units = [10, 20, 10], num_classes = 3);

WARNING:tensorflow:Using temporary folder as model directory: C:\Users\sforrest\AppData\Local\Temp\tmpl9jkz4pj

We are now ready to train the model.

classifier:-Train(training_set[1 .. 4], training_set[5], steps = 2000, num_epochs = none, shuffle = true);

Now trained, we can evaluate the classifier on the test set, and we see that we have achieved 96.7% predictive accuracy:

results := classifier:-Evaluate(test_set[1 .. 4], test_set[5], shuffle = false, num_epochs = 1);

[Evaluation metrics for the test set, including an accuracy of 0.967]

We can now build a predictor function that takes an arbitrary set of measurements as a DataSeries and returns a prediction:

predictor := ds -> classifier:-Predict(Transpose(DataFrame(ds)), num_epochs = 1, shuffle = false)[1]:


Using a Trained Model 

With this we can take an arbitrary new point, and generate a prediction from the trained model:

ds := DataSeries([5.8, 3.1, 5.0, 1.7], labels = cols[1 .. 4]);

predictor(ds);

[Prediction record; its probabilities field holds the estimated probabilities for each class]

The probabilities field in the above result records the estimated probabilities for each class.

In this case, the model estimates a 97.8% probability that this particular sample is class 2, and therefore predicts that it is I. virginica.


FileTools

The CanonicalPath command resolves any relative directories and symbolic links present in a given file path to construct a canonical version of this path.

> with(FileTools):
> CanonicalPath("C:\\Windows\\..\\Users");
C:\Users

The IsLink command tests whether a given file path corresponds to a symbolic link.

> IsLink("mylink");    # "mylink" is assumed to have been created as a symbolic link
true

XMLTools

The ToRecord command formats an XML tree as a nested record.

> with(XMLTools):

This example shows how repeated elements are put into a list, and how the order in which they occurred can be deduced from the _order export.

> xml := ParseString("<doc><a>1</a><b>2</b><a>3</a></doc>"):
> r := ToRecord(xml):
> Print(r);

<doc>
  <a>1</a>
  <b>2</b>
  <a>3</a>
</doc>
> r:-doc:-a[1]
1
> r:-doc:-b
2
> r:-doc:-a[2]
3
> r:-doc:-_order
[a, b, a]

Additional Updates

The new MapleTA:-QTI command converts IMS Question & Test Interoperability (QTI) files into Maple T.A./Möbius course modules.
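
A conversion call follows the pattern below (a sketch only: the file and folder names here are hypothetical, and the exact calling sequence is given on the MapleTA:-QTI help page):

> with(MapleTA):
> QTI("questions-qti.zip", "C:/courses/module1");    # hypothetical QTI archive and output location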

OpenMaple enhances Java connectivity with several new commands, including toBigDecimal, toBigInteger, Relation, evalBoolean, lhs, and set.