NeuroShell 2

Category Intelligent Software>Neural Network Systems/Tools

Abstract NeuroShell 2 is the software manufacturer's legacy neural network (NN) product, targeted toward computer science instructors and students. It contains classic algorithms and architectures popular with graduate school professors and computer science students. NeuroShell 2 combines advanced NN architectures, a Microsoft Windows icon-driven user interface, utilities, and popular options to give users the ultimate NN experimental environment. It is recommended for academic users only, or for those concerned with classic NN paradigms such as backpropagation.

Features/Capabilities include:

Beginner's Neural Networks - Product includes interfaces for beginners and experts alike. The Beginner's System in NeuroShell 2 is designed for novices and first-time users. It includes a simplified set of procedures for building and executing a complete, advanced NN application. The Beginner's System uses an enhanced backpropagation paradigm and supplies default network parameters such as learning rate, momentum, and number of hidden neurons. To use the system, you enter data, specify the inputs and outputs, and train the network. You can then apply the trained network to new data and export the results to other programs.

Advanced Neural Networks - The Advanced System in NeuroShell 2 gives experienced NN users the ability to create and execute 16 different NN architectures, with more user control compared to the Beginner's System.

Spreadsheet Format - Users can choose to enter their data in their familiar spreadsheet program or work in the NeuroShell 2 Datagrid. Product uses spreadsheet files as its internal format so you can view or edit them yourself.

File Import and Export - Product imports and exports American Standard Code for Information Interchange (ASCII) and binary file formats. The File Export Module allows you to convert NeuroShell 2 files to ASCII or binary files. It will also merge data into an existing spreadsheet or print a file.

Data Preprocessing -

1) The Symbol Translate module converts alphanumeric data or strings into numbers which can be processed by the network. For example, you may want to convert "cold", "warm", and "hot" to 1, 2, and 3 respectively.

2) The Rules Module allows you to create If/Then/Else type rules to preprocess data prior to feeding it to the network. For example, you can use two input variables to create a third.

Note: You may also use the Rules Module to post-process the network's predictions and classifications.
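The two preprocessing steps described above can be sketched in Python. This is an illustration only; the function names and the rule are hypothetical, not NeuroShell's API.

```python
def translate_symbols(values, mapping):
    """Symbol Translate idea: replace each string with its numeric code."""
    return [mapping[v] for v in values]

def derive_third_input(x1, x2):
    """Rules Module idea: an If/Then/Else rule that creates a third
    input variable from two existing ones (rule chosen arbitrarily)."""
    return x1 * x2 if x1 > 0 else x2 - x1

# Convert "cold", "warm", "hot" to 1, 2, 3 respectively:
temps = translate_symbols(["cold", "warm", "hot"],
                          {"cold": 1, "warm": 2, "hot": 3})
print(temps)                          # [1, 2, 3]
print(derive_third_input(2.0, 5.0))   # 10.0
```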

Test Set Extract - NeuroShell 2 makes it easy to pull out test and production data sets from the training data. The Test Set Extract Module offers five (5) different methods for selecting data:

1) N percent randomly chosen; 2) Every Nth pattern; 3) Block designation; 4) By row marker in the data; 5) Combination of block designation and random selection.
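Two of the extraction methods above can be sketched in Python. This is a hypothetical illustration of the idea, not NeuroShell's implementation.

```python
import random

def extract_percent(rows, percent, seed=0):
    """Method 1: pull out N percent of rows, randomly chosen, as the test set."""
    rng = random.Random(seed)
    k = round(len(rows) * percent / 100)
    test_idx = set(rng.sample(range(len(rows)), k))
    test = [r for i, r in enumerate(rows) if i in test_idx]
    train = [r for i, r in enumerate(rows) if i not in test_idx]
    return train, test

def extract_every_nth(rows, n):
    """Method 2: every Nth pattern goes to the test set."""
    test = rows[n - 1::n]
    train = [r for i, r in enumerate(rows) if (i + 1) % n != 0]
    return train, test

train, test = extract_every_nth(list(range(10)), 5)
print(test)   # [4, 9]
```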

Neural Network Architectures:

The Design Module offers a palette of 16 different neural network architectures/paradigms for different types of data.

Note: Users may customize each neural network architecture by setting parameters.

User Control - Users can specify their own learning rate, momentum, activation functions, and initial weight ranges on a layer basis in the Design Module. Elect rotational or random pattern selection. Choose multiple criteria for stopping training. Select different methods for handling missing data. Users can view weight values during training, including weights displayed in modified Hinton diagrams. Weights may be modified during learning.

Backpropagation Training - Twelve (12) of the architectures include the Ward Systems Group version of backpropagation, which has been enhanced for speed and accuracy.

Standard Connections - This is the standard type of backpropagation network in which every layer is connected or linked only to the previous layer.

Jump Connections - This is the type of backpropagation network in which every layer is connected or linked to every previous layer.

Recurrent Networks - This type of backpropagation network is often used in predicting financial markets because recurrent networks can learn sequences. Therefore, they are excellent for time series data.

A regular feed-forward network responds to a given input pattern with exactly the same output every time that pattern is presented. A recurrent network may respond to the same input pattern differently at different times, depending upon the input patterns that were presented previously. Recurrent networks build a long-term memory in their internal neurons.
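This behavior can be sketched with a single Elman-style recurrent neuron in Python. The weights are arbitrary illustrative values; the point is that identical inputs produce different outputs because the previous state feeds back in.

```python
import math

def elman_step(x, h_prev, w_in, w_rec, b):
    """One recurrent-neuron update: the new activation depends on the
    current input AND the previous state, so the same input can yield
    different outputs at different times."""
    return math.tanh(w_in * x + w_rec * h_prev + b)

h = 0.0
outs = []
for x in [1.0, 1.0, 1.0]:          # the input never changes...
    h = elman_step(x, h, w_in=0.5, w_rec=0.8, b=0.0)
    outs.append(round(h, 3))
print(outs)   # ...yet each response differs, because of the stored state
```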

Ward Nets - This type of backpropagation network is able to detect different features in the data with the use of multiple slabs of neurons in the hidden layer, each with a different activation function. Activation functions are functions used internally in neurons to "fire" the neurons. When you apply a different activation function to each slab in the hidden layer, the network can discover novel features in a single pattern processed through the network.

All backpropagation algorithms come with a choice of Vanilla, Momentum, or TurboProp weight updates. With Vanilla, a learning rate is applied to the weight updates but a momentum term is not. With Momentum, the weight updates not only include the change dictated by the learning rate, but also a portion of the last weight change.
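The difference between the Vanilla and Momentum update rules can be sketched as follows (illustrative Python, not NeuroShell's code; the gradient and parameter values are made up):

```python
def vanilla_update(w, grad, lr):
    """Vanilla: the weight change is just the learning rate times the
    gradient; no momentum term."""
    return w - lr * grad

def momentum_update(w, grad, lr, momentum, last_delta):
    """Momentum: the change dictated by the learning rate, plus a
    portion of the previous weight change."""
    delta = -lr * grad + momentum * last_delta
    return w + delta, delta

w = vanilla_update(1.0, grad=0.5, lr=0.1)                    # 0.95
w, d = momentum_update(w, grad=0.5, lr=0.1,
                       momentum=0.9, last_delta=-0.05)
print(round(w, 3), round(d, 3))   # 0.855 -0.095
```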

All of the backpropagation networks include the Calibration feature, which prevents overtraining (thereby greatly reducing training time), and increases the network's ability to generalize well on new data.

TurboProp - This is a training method for feed forward networks that operates much faster than backpropagation. TurboProp offers users the additional advantage of not requiring learning rate and momentum to be set.

Kohonen Architecture - The Kohonen Self Organizing Map network used in NeuroShell 2 is a type of unsupervised network, which means it has the ability to learn without being shown correct outputs in sample patterns. The networks are able to separate data into a specified number of categories.

PNN Architecture - Probabilistic Neural Networks (PNN) are known for their ability to train on sparse data sets and they train in only one pass of the training set! PNN separates data into a specified number of output categories. PNN networks are often able to function as soon as two (2) training patterns are available, so training can be incremental.
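A one-input PNN classifier can be sketched in a few lines of Python. This is a textbook-style illustration of the idea; NeuroShell's implementation certainly differs, and the data and smoothing value are made up.

```python
import math

def pnn_classify(x, patterns, sigma):
    """PNN idea: training is just storing the patterns (one pass,
    incremental). To classify, sum a Gaussian kernel per stored pattern,
    grouped by class; the class with the largest total wins."""
    scores = {}
    for xi, label in patterns:
        k = math.exp(-(x - xi) ** 2 / (2 * sigma ** 2))
        scores[label] = scores.get(label, 0.0) + k
    return max(scores, key=scores.get)

patterns = [(0.0, "low"), (0.2, "low"), (1.0, "high")]
print(pnn_classify(0.1, patterns, sigma=0.5))   # low
```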

GRNN Architecture - Like PNN networks, General Regression Neural Networks (GRNN) are known for their ability to train in only one pass of the training set using sparse data sets. Rather than categorizing data like PNN, however, GRNN applications are able to produce continuous valued outputs.

GRNN is especially useful for continuous function approximation, and can fit multidimensional surfaces through data. In tests conducted by the manufacturer, GRNN responded much better than backpropagation to many types of problems.

Note: GRNN is not the same as regression analysis!
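The GRNN idea, a kernel-weighted average of the stored training outputs controlled by a smoothing factor, can be sketched in Python. This is an illustrative one-input version, not NeuroShell's implementation; the data values are made up.

```python
import math

def grnn_predict(x, train_x, train_y, sigma):
    """GRNN output: a Gaussian-kernel-weighted average of the training
    outputs. Training is one pass (just store the data); sigma is the
    smoothing factor that Calibration optimizes."""
    weights = [math.exp(-(x - xi) ** 2 / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]
print(round(grnn_predict(1.0, xs, ys, sigma=0.3), 3))
```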

Genetic Adaptive Nets - The manufacturer's Genetic Adaptive feature uses a Genetic Algorithm to optimize the network structure of its GRNN and PNN nets. At the same time, the genetic algorithm eliminates bad inputs and gives you a sensitivity factor for the ones it keeps.

GMDH Architecture - NeuroShell 2 includes a very advanced architecture called Group Method of Data Handling (GMDH), or polynomial nets. GMDH neural nets derive a mathematical formula: a nonlinear polynomial expression relating the values of the most important inputs to the predicted output variable. The GMDH network is implemented with polynomial terms in the links and a genetic-like component that decides how many layers are built. The result of training at the output layer can be represented as a polynomial function of all or some of the inputs. GMDH can build very complex models while avoiding overfitting problems.

Calibration - Calibration solves one of the most difficult problems for neural networks -- knowing when to stop training. The longer the network learns the training set, the closer it gets to "memorizing" the training set. If you present the network with a pattern that was in the training set, it will make a very accurate prediction. If, however, you present a pattern that was not in the training set, the network may not be able to "generalize" well on data it hasn't "seen" before if you have let it train too long. Calibration corrects this.

For backpropagation networks, Calibration saves the network at the point where it gives the most accurate answers for patterns outside the training set.

For GRNN and PNN networks, Calibration finds the optimum smoothing factor, a parameter that is used when you apply the network. The Genetic Adaptive nets calibrate with the genetic algorithm.
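The backpropagation form of Calibration, saving the network at the point where test-set error is lowest, can be sketched as a toy Python loop. The error curves here are invented purely to illustrate the overtraining turnaround.

```python
def train_with_calibration(steps, train_error, test_error):
    """Keep training, but remember the checkpoint (here, just the step
    index) with the lowest error on the held-out test set, and return
    that best checkpoint instead of the final, overtrained one."""
    best_step, best_err = 0, float("inf")
    for step in range(steps):
        train_error(step)            # training error keeps falling...
        err = test_error(step)       # ...but test error turns back up
        if err < best_err:
            best_step, best_err = step, err
    return best_step, best_err

# Toy curves: test error bottoms out at step 4, then overtraining sets in.
best = train_with_calibration(
    10,
    train_error=lambda s: 1.0 / (s + 1),
    test_error=lambda s: (s - 4) ** 2 + 1.0,
)
print(best)   # (4, 1.0)
```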

Limits - Product has theoretical limits of 65,535 rows, 32,768 columns, and 16,000 neurons. However, neural nets will not train effectively with much over 2,000 inputs, and some network types lose effectiveness at about 400 inputs and 5,000 to 10,000 rows (patterns), and may not function at all near the upper limits. All 65,535 rows are possible, but many older spreadsheet programs will display only 16,000 rows; the NeuroShell 2 Datagrid displays 32,000.

Input Sensitivity - NeuroShell 2 has three (3) different ways of determining sensitivity of inputs depending on network type.

On-Line Tutorial - Product's on-line tutorial directs you through the steps required to use NeuroShell 2 and create a working application.

Two (2) tutorial example programs, including one which uses stock market data, are included in the on-line manual and on the distribution diskettes. Sample programs are included for backpropagation, Kohonen, PNN, and GRNN networks. Several other examples are also provided.

Graphics Capabilities - 1) Graphs variables across all patterns; 2) Graphs variable sets in a pattern; 3) Correlation scatter plot displays a linear correlation coefficient between the two graphed variables; 4) High-low-close graph manifests trends in market data; 5) Training graphics included for each learning paradigm.

System Requirements

Windows Vista, Windows XP, and Windows 2000 with SP4 are the only supported operating systems. The software may still work on Windows 95/98/Me/NT, but its performance is not guaranteed on those older systems.

If you run the operating systems above in a virtual machine on a Mac (under Parallels, for example) or on Linux, our programs should work, but our technical support department cannot assist you except with Windows Vista, XP, and 2000 with SP4.

You must use a PC with an Intel-compatible processor (such as AMD) and at least 256MB of RAM.

GeneHunter and NeuroShell Run-Time Server are compatible with Microsoft Excel versions up to 2007.

NeuroShell 2 can import from Microsoft Excel spreadsheets up to 2007. Internal files can be viewed by our own data grid program or Excel versions up to 2003.


Manufacturer Web Site Ward Systems Group, Inc.

Price Contact manufacturer.

G6G Abstract Number 20051

G6G Manufacturer Number 102960