DelValueAction(int nTargetNode, String sValue) {
  try {
    m_nTargetNode = nTargetNode;
    m_sValue = sValue;
    m_att = m_Instances.attribute(nTargetNode);
    SerializedObject so = new SerializedObject(m_Distributions[nTargetNode]);
    m_CPT = (Estimator[]) so.getObject();
    m_children = new ArrayList<Integer>();
    for (int iNode = 0; iNode < getNrOfNodes(); iNode++) {
      if (m_ParentSets[iNode].contains(nTargetNode)) {
        m_children.add(iNode);
      }
    }
    m_childAtts = new Estimator[m_children.size()][];
    for (int iChild = 0; iChild < m_children.size(); iChild++) {
      int nChild = m_children.get(iChild);
      m_childAtts[iChild] = m_Distributions[nChild];
    }
  } catch (Exception e) {
    e.printStackTrace();
  }
}
/** * Creates a given number of deep or shallow (if the kernel implements * Copyable) copies of the given kernel using serialization. * * @param model the kernel to copy * @param num the number of kernel copies to create. * @return an array of kernels. * @throws Exception if an error occurs */ public static Kernel[] makeCopies(Kernel model, int num) throws Exception { if (model == null) { throw new Exception("No model kernel set"); } Kernel[] kernels = new Kernel[num]; if (model instanceof Copyable) { for (int i = 0; i < kernels.length; i++) { kernels[i] = (Kernel) ((Copyable) model).copy(); } } else { SerializedObject so = new SerializedObject(model); for (int i = 0; i < kernels.length; i++) { kernels[i] = (Kernel) so.getObject(); } } return kernels; }
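The `makeCopies` variants in this corpus all rely on Weka's `SerializedObject` for the deep-copy branch. As a rough, self-contained sketch of what such a copy involves, assuming `SerializedObject` wraps standard Java serialization (the `DeepCopy` class and its `copy` method below are hypothetical illustrations, not Weka API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;

// Hypothetical sketch of serialization-based deep copying; assumes the
// source object graph is fully Serializable, as the makeCopies Javadoc
// above requires.
public class DeepCopy {

  // Serialize the object to an in-memory byte array, then deserialize a
  // fresh, independent copy from those bytes.
  public static Object copy(Serializable source) throws Exception {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
      oos.writeObject(source);
    }
    try (ObjectInputStream ois =
        new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
      return ois.readObject();
    }
  }

  public static void main(String[] args) throws Exception {
    ArrayList<String> original = new ArrayList<>();
    original.add("a");
    @SuppressWarnings("unchecked")
    ArrayList<String> clone = (ArrayList<String>) copy(original);
    clone.add("b");
    // Mutating the clone leaves the original untouched: prints "1 / 2"
    System.out.println(original.size() + " / " + clone.size());
  }
}
```

Serialization round-trips the entire object graph, which is why a built model travels with its copy; the `Copyable` fast path above exists because serialization is comparatively expensive.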
/**
 * Gets the current flow being edited. The flow is returned as a single Vector
 * containing two other Vectors: the beans and the connections. These two
 * vectors are deep-copied via serialization before being returned.
 *
 * @return the current flow being edited
 * @throws Exception if a problem occurs
 */
public Vector<Vector<?>> getFlow() throws Exception {
  Vector<Vector<?>> v = new Vector<Vector<?>>();
  Vector<Object> beans = BeanInstance.getBeanInstances(m_mainKFPerspective
      .getCurrentTabIndex());
  Vector<BeanConnection> connections = BeanConnection
      .getConnections(m_mainKFPerspective.getCurrentTabIndex());
  detachFromLayout(beans);
  v.add(beans);
  v.add(connections);
  SerializedObject so = new SerializedObject(v);
  @SuppressWarnings("unchecked")
  Vector<Vector<?>> copy = (Vector<Vector<?>>) so.getObject();
  // tempWrite(beans, connections);
  integrateFlow(beans, connections, true, false);
  return copy;
}
/**
 * Get a named DefineFunction. Returns a deep copy of the function.
 *
 * @param functionName the name of the function to get
 * @return the named function, or null if it cannot be found
 * @throws Exception if there is a problem deep copying the function
 */
protected DefineFunction getFunction(String functionName) throws Exception {
  DefineFunction copy = null;
  DefineFunction match = null;
  for (DefineFunction f : m_defineFunctions) {
    if (f.getName().equals(functionName)) {
      match = f;
      // System.err.println("Found a match!!!");
      break;
    }
  }
  if (match != null) {
    SerializedObject so = new SerializedObject(match, false);
    copy = (DefineFunction) so.getObject();
    // System.err.println(copy);
  }
  return copy;
}
DelValueAction(int nTargetNode, String sValue) {
  try {
    m_nTargetNode = nTargetNode;
    m_sValue = sValue;
    m_att = m_Instances.attribute(nTargetNode);
    SerializedObject so = new SerializedObject(m_Distributions[nTargetNode]);
    m_CPT = (Estimator[]) so.getObject();
    m_children = new FastVector();
    for (int iNode = 0; iNode < getNrOfNodes(); iNode++) {
      if (m_ParentSets[iNode].contains(nTargetNode)) {
        m_children.addElement(iNode);
      }
    }
    m_childAtts = new Estimator[m_children.size()][];
    for (int iChild = 0; iChild < m_children.size(); iChild++) {
      int nChild = (Integer) m_children.elementAt(iChild);
      m_childAtts[iChild] = m_Distributions[nChild];
    }
  } catch (Exception e) {
    e.printStackTrace();
  }
}
/**
 * Gets the current flow being edited. The flow is returned as a single
 * Vector containing two other Vectors: the beans and the connections.
 * These two vectors are deep-copied via serialization before being
 * returned.
 *
 * @return the current flow being edited
 * @throws Exception if a problem occurs
 */
public Vector getFlow() throws Exception {
  Vector v = new Vector();
  Vector beans =
    BeanInstance.getBeanInstances(m_mainKFPerspective.getCurrentTabIndex());
  Vector connections =
    BeanConnection.getConnections(m_mainKFPerspective.getCurrentTabIndex());
  detachFromLayout(beans);
  v.add(beans);
  v.add(connections);
  SerializedObject so = new SerializedObject(v);
  Vector copy = (Vector) so.getObject();
  // tempWrite(beans, connections);
  integrateFlow(beans, connections, true, false);
  return copy;
}
/**
 * Get a named DefineFunction. Returns a deep copy of the function.
 *
 * @param functionName the name of the function to get
 * @return the named function, or null if it cannot be found
 * @throws Exception if there is a problem deep copying the function
 */
protected DefineFunction getFunction(String functionName) throws Exception {
  DefineFunction copy = null;
  DefineFunction match = null;
  for (DefineFunction f : m_defineFunctions) {
    if (f.getName().equals(functionName)) {
      match = f;
      // System.err.println("Found a match!!!");
      break;
    }
  }
  if (match != null) {
    SerializedObject so = new SerializedObject(match, false);
    copy = (DefineFunction) so.getObject();
    // System.err.println(copy);
  }
  return copy;
}
/** * Creates a given number of deep or shallow (if the kernel implements Copyable) * copies of the given kernel using serialization. * * @param model the kernel to copy * @param num the number of kernel copies to create. * @return an array of kernels. * @throws Exception if an error occurs */ public static Kernel[] makeCopies(Kernel model, int num) throws Exception { if (model == null) throw new Exception("No model kernel set"); Kernel[] kernels = new Kernel[num]; if (model instanceof Copyable) { for (int i = 0; i < kernels.length; i++) { kernels[i] = (Kernel) ((Copyable) model).copy(); } } else { SerializedObject so = new SerializedObject(model); for (int i = 0; i < kernels.length; i++) kernels[i] = (Kernel) so.getObject(); } return kernels; }
/**
 * Instantiates (by making a serialized copy) the supplied template meta bean
 * for display in the user tool bar.
 *
 * @param bean the prototype MetaBean to display in the toolbar
 */
private JPanel instantiateToolBarMetaBean(MetaBean bean) {
  // copy the bean via serialization
  ((Visible) bean).getVisual().removePropertyChangeListener(this);
  bean.removePropertyChangeListenersSubFlow(this);
  Object copy = null;
  try {
    SerializedObject so = new SerializedObject(bean);
    copy = so.getObject();
  } catch (Exception ex) {
    ex.printStackTrace();
    return null;
  }
  ((Visible) bean).getVisual().addPropertyChangeListener(this);
  bean.addPropertyChangeListenersSubFlow(this);

  String displayName = "";
  if (copy instanceof Visible) {
    ((Visible) copy).getVisual().scale(3);
    displayName = ((Visible) copy).getVisual().getText();
  }
  return makeHolderPanelForToolBarBean(displayName, copy, false, null, true);
}
/**
 * Gets the current flow being edited. The flow is returned as a single Vector
 * containing two other Vectors: the beans and the connections. These two
 * vectors are deep-copied via serialization before being returned.
 *
 * @return the current flow being edited
 * @throws Exception if a problem occurs
 */
public Vector getFlow() throws Exception {
  Vector v = new Vector();
  Vector beans = BeanInstance.getBeanInstances();
  Vector connections = BeanConnection.getConnections();
  detachFromLayout(beans);
  v.add(beans);
  v.add(connections);
  SerializedObject so = new SerializedObject(v);
  Vector copy = (Vector) so.getObject();
  // tempWrite(beans, connections);
  integrateFlow(beans, connections);
  return copy;
}
/** * Creates copies of the current clusterer. Note that this method now uses * Serialization to perform a deep copy, so the Clusterer object must be fully * Serializable. Any currently built model will now be copied as well. * * @param model an example clusterer to copy * @param num the number of clusterer copies to create. * @return an array of clusterers. * @exception Exception if an error occurs */ public static Clusterer[] makeCopies(Clusterer model, int num) throws Exception { if (model == null) { throw new Exception("No model clusterer set"); } Clusterer[] clusterers = new Clusterer[num]; SerializedObject so = new SerializedObject(model); for (int i = 0; i < clusterers.length; i++) { clusterers[i] = (Clusterer) so.getObject(); } return clusterers; }
/** * Creates copies of the current clusterer. Note that this method * now uses Serialization to perform a deep copy, so the Clusterer * object must be fully Serializable. Any currently built model will * now be copied as well. * * @param model an example clusterer to copy * @param num the number of clusterer copies to create. * @return an array of clusterers. * @exception Exception if an error occurs */ public static DensityBasedClusterer [] makeCopies(DensityBasedClusterer model, int num) throws Exception { if (model == null) { throw new Exception("No model clusterer set"); } DensityBasedClusterer [] clusterers = new DensityBasedClusterer [num]; SerializedObject so = new SerializedObject(model); for(int i = 0; i < clusterers.length; i++) { clusterers[i] = (DensityBasedClusterer) so.getObject(); } return clusterers; }
/** * Creates copies of the current evaluator. Note that this method now uses * Serialization to perform a deep copy, so the evaluator object must be fully * Serializable. Any currently built model will now be copied as well. * * @param model an example evaluator to copy * @param num the number of evaluator copies to create. * @return an array of evaluators. * @exception Exception if an error occurs */ public static ASEvaluation[] makeCopies(ASEvaluation model, int num) throws Exception { if (model == null) { throw new Exception("No model evaluator set"); } ASEvaluation[] evaluators = new ASEvaluation[num]; SerializedObject so = new SerializedObject(model); for (int i = 0; i < evaluators.length; i++) { evaluators[i] = (ASEvaluation) so.getObject(); } return evaluators; }
/** * Creates copies of the current search scheme. Note that this method * now uses Serialization to perform a deep copy, so the search * object must be fully Serializable. Any currently built model will * now be copied as well. * * @param model an example search scheme to copy * @param num the number of search scheme copies to create. * @return an array of search schemes. * @throws Exception if an error occurs */ public static ASSearch[] makeCopies(ASSearch model, int num) throws Exception { if (model == null) throw new Exception("No model search scheme set"); ASSearch[] result = new ASSearch[num]; SerializedObject so = new SerializedObject(model); for (int i = 0; i < result.length; i++) result[i] = (ASSearch) so.getObject(); return result; }
/** * returns deep copies of the given object * * @param obj the object to copy * @param num the number of copies * @return the deep copies * @throws Exception if copying fails */ protected Object[] makeCopies(Object obj, int num) throws Exception { if (obj == null) { throw new Exception("No object set"); } Object[] objs = new Object[num]; SerializedObject so = new SerializedObject(obj); for (int i = 0; i < objs.length; i++) { objs[i] = so.getObject(); } return objs; }
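All of the `makeCopies` variants above share one pattern: serialize the model once, then deserialize it repeatedly, once per copy. A minimal runnable sketch of that pattern, assuming `SerializedObject` caches the serialized bytes as shown (the `SerializedBlob` class is a hypothetical stand-in, not Weka's actual class):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-in for SerializedObject: the model is serialized to a
// byte array once in the constructor, and each getObject() call
// deserializes a fresh, independent instance from that array.
public class SerializedBlob {
  private final byte[] bytes;

  public SerializedBlob(Serializable obj) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
      oos.writeObject(obj);
    }
    bytes = bos.toByteArray();
  }

  public Object getObject() throws IOException, ClassNotFoundException {
    try (ObjectInputStream ois =
        new ObjectInputStream(new ByteArrayInputStream(bytes))) {
      return ois.readObject();
    }
  }

  // The shared makeCopies pattern: serialize once, deserialize num times.
  public static Object[] makeCopies(Serializable model, int num)
      throws Exception {
    if (model == null) {
      throw new Exception("No model object set");
    }
    Object[] copies = new Object[num];
    SerializedBlob so = new SerializedBlob(model);
    for (int i = 0; i < copies.length; i++) {
      copies[i] = so.getObject();
    }
    return copies;
  }
}
```

Serializing once and deserializing per copy avoids re-walking the source object graph for every copy, which matters when stamping out many clones of a large trained model.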
AddArcAction(int nParent, int nChild) {
  try {
    m_nParent = nParent;
    m_children = new ArrayList<Integer>();
    m_children.add(nChild);
    // m_nChild = nChild;
    SerializedObject so = new SerializedObject(m_Distributions[nChild]);
    m_CPT = new Estimator[1][];
    m_CPT[0] = (Estimator[]) so.getObject();
  } catch (Exception e) {
    e.printStackTrace();
  }
}
AddArcAction(int nParent, ArrayList<Integer> children) { try { m_nParent = nParent; m_children = new ArrayList<Integer>(); m_CPT = new Estimator[children.size()][]; for (int iChild = 0; iChild < children.size(); iChild++) { int nChild = children.get(iChild); m_children.add(nChild); SerializedObject so = new SerializedObject(m_Distributions[nChild]); m_CPT[iChild] = (Estimator[]) so.getObject(); } } catch (Exception e) { e.printStackTrace(); } }
@Override public void undo() { try { for (int iChild = 0; iChild < m_children.size(); iChild++) { int nChild = m_children.get(iChild); deleteArc(m_nParent, nChild); SerializedObject so = new SerializedObject(m_CPT[iChild]); m_Distributions[nChild] = (Estimator[]) so.getObject(); } } catch (Exception e) { e.printStackTrace(); } }
DeleteArcAction(int nParent, int nChild) { try { m_nChild = nChild; m_nParent = nParent; m_nParents = new int[getNrOfParents(nChild)]; for (int iParent = 0; iParent < m_nParents.length; iParent++) { m_nParents[iParent] = getParent(nChild, iParent); } SerializedObject so = new SerializedObject(m_Distributions[nChild]); m_CPT = (Estimator[]) so.getObject(); } catch (Exception e) { e.printStackTrace(); } }
@Override public void undo() { try { SerializedObject so = new SerializedObject(m_CPT); m_Distributions[m_nChild] = (Estimator[]) so.getObject(); ParentSet parentSet = new ParentSet(); for (int m_nParent2 : m_nParents) { parentSet.addParent(m_nParent2, m_Instances); } m_ParentSets[m_nChild] = parentSet; } catch (Exception e) { e.printStackTrace(); } }
SetDistributionAction(int nTargetNode, double[][] P) {
  try {
    m_nTargetNode = nTargetNode;
    SerializedObject so = new SerializedObject(m_Distributions[nTargetNode]);
    m_CPT = (Estimator[]) so.getObject();
    m_P = P;
  } catch (Exception e) {
    e.printStackTrace();
  }
}
@Override public void undo() { try { SerializedObject so = new SerializedObject(m_CPT); m_Distributions[m_nTargetNode] = (Estimator[]) so.getObject(); } catch (Exception e) { e.printStackTrace(); } }
void updateStatus() { a_undo.setEnabled(m_BayesNet.canUndo()); a_redo.setEnabled(m_BayesNet.canRedo()); a_datagenerator.setEnabled(m_BayesNet.getNrOfNodes() > 0); if (!m_bViewMargins && !m_bViewCliques) { repaint(); return; } try { m_marginCalculator = new MarginCalculator(); m_marginCalculator.calcMargins(m_BayesNet); SerializedObject so = new SerializedObject(m_marginCalculator); m_marginCalculatorWithEvidence = (MarginCalculator) so.getObject(); for (int iNode = 0; iNode < m_BayesNet.getNrOfNodes(); iNode++) { if (m_BayesNet.getEvidence(iNode) >= 0) { m_marginCalculatorWithEvidence.setEvidence(iNode, m_BayesNet.getEvidence(iNode)); } } for (int iNode = 0; iNode < m_BayesNet.getNrOfNodes(); iNode++) { m_BayesNet.setMargin(iNode, m_marginCalculatorWithEvidence.getMargin(iNode)); } } catch (Exception e) { e.printStackTrace(); } repaint(); }
/** * Creates a given number of deep copies of the given classifier using * serialization. * * @param model the classifier to copy * @param num the number of classifier copies to create. * @return an array of classifiers. * @exception Exception if an error occurs */ public static Classifier[] makeCopies(Classifier model, int num) throws Exception { if (model == null) { throw new Exception("No model classifier set"); } Classifier[] classifiers = new Classifier[num]; SerializedObject so = new SerializedObject(model); for (int i = 0; i < classifiers.length; i++) { classifiers[i] = (Classifier) so.getObject(); } return classifiers; }
/** * Executes the script without loading it first. * * @param file the script to execute * @param args the commandline parameters for the script */ public void run(File file, String[] args) { Script script; try { script = (Script) new SerializedObject(this).getObject(); script.m_Filename = file; script.m_Modified = false; script.start(args); } catch (Exception e) { e.printStackTrace(); } }
/**
 * Sets the cell renderer and calculates the optimal column width.
 */
private void setLayout() {
  ArffSortedTableModel arffModel;
  int i;
  JComboBox combo;
  Enumeration<Object> enm;

  arffModel = (ArffSortedTableModel) getModel();
  for (i = 0; i < getColumnCount(); i++) {
    // optimal column widths (only according to header!)
    JTableHelper.setOptimalHeaderWidth(this, i);

    // CellRenderer
    getColumnModel().getColumn(i).setCellRenderer(new ArffTableCellRenderer());

    // CellEditor
    if (i > 0) {
      if (arffModel.getType(i) == Attribute.NOMINAL) {
        combo = new JComboBox();
        combo.addItem(null);
        enm = arffModel.getInstances().attribute(i - 1).enumerateValues();
        while (enm.hasMoreElements()) {
          Object o = enm.nextElement();
          if (o instanceof SerializedObject) {
            ((SerializedObject) o).getObject();
          }
          combo.addItem(o);
        }
        getColumnModel().getColumn(i).setCellEditor(new DefaultCellEditor(combo));
      } else {
        getColumnModel().getColumn(i).setCellEditor(null);
      }
    }
  }
}
/**
 * Set the flow for the KnowledgeFlow to edit. Assumes that the client has
 * loaded a Vector of beans and a Vector of connections. The supplied beans
 * and connections are deep-copied via serialization before being set in the
 * layout. The beans get added to the flow at position 0.
 *
 * @param v a Vector containing a Vector of beans and a Vector of connections
 * @exception Exception if something goes wrong
 */
@SuppressWarnings("unchecked")
public void setFlow(Vector<Vector<?>> v) throws Exception {
  // Vector beansCopy = null, connectionsCopy = null;
  // clearLayout();
  if (getAllowMultipleTabs()) {
    throw new Exception("[KnowledgeFlow] setFlow() - can only set a flow in "
        + "single tab only mode");
  }

  /*
   * int tabI = 0;
   *
   * BeanInstance.removeAllBeansFromContainer(
   *     (JComponent) m_mainKFPerspective.getBeanLayout(tabI), tabI);
   * BeanInstance.setBeanInstances(new Vector(),
   *     m_mainKFPerspective.getBeanLayout(tabI));
   * BeanConnection.setConnections(new Vector());
   */
  // m_mainKFPerspective.removeTab(0);
  // m_mainKFPerspective.addTab("Untitled");

  m_beanLayout.removeAll();
  BeanInstance.init();
  BeanConnection.init();

  SerializedObject so = new SerializedObject(v);
  Vector<Vector<?>> copy = (Vector<Vector<?>>) so.getObject();
  Vector<Object> beans = (Vector<Object>) copy.elementAt(0);
  Vector<BeanConnection> connections = (Vector<BeanConnection>) copy
      .elementAt(1);

  // reset environment variables
  m_flowEnvironment = new Environment();
  integrateFlow(beans, connections, true, false);
  revalidate();
  notifyIsDirty();
}
public ExperimentRunner(final Experiment exp) throws Exception { // Create a full copy using serialization if (exp == null) { System.err.println("Null experiment!!!"); } else { System.err.println("Running experiment: " + exp.toString()); } System.err.println("Writing experiment copy"); SerializedObject so = new SerializedObject(exp); System.err.println("Reading experiment copy"); m_ExpCopy = (Experiment) so.getObject(); System.err.println("Made experiment copy"); }
/** * Makes a copy of an object using serialization * * @param source the object to copy * @return a copy of the source object */ protected Object copyObject(Object source) { Object result = null; try { SerializedObject so = new SerializedObject(source); result = so.getObject(); } catch (Exception ex) { System.err.println("AlgorithmListPanel: Problem copying object"); System.err.println(ex); } return result; }
/** * Creates a given number of deep copies of the given filter using * serialization. * * @param model the filter to copy * @param num the number of filter copies to create. * @return an array of filters. * @throws Exception if an error occurs */ public static Filter[] makeCopies(Filter model, int num) throws Exception { if (model == null) { throw new Exception("No model filter set"); } Filter[] filters = new Filter[num]; SerializedObject so = new SerializedObject(model); for (int i = 0; i < filters.length; i++) { filters[i] = (Filter) so.getObject(); } return filters; }
/**
 * Sets the instances to build the filtering model from.
 *
 * @param data the Instances object
 */
public void setInstances(Instances data) {
  try {
    super.setInstances(data);

    // Apply user-specified filter
    Instances filteredData = new Instances(data);
    getFilter().setInputFormat(filteredData);
    filteredData = Filter.useFilter(data, getFilter());
    if (data.numInstances() != filteredData.numInstances()) {
      throw new IllegalArgumentException(
          "FilteredNeighbourSearch: Filter has changed the number of instances!");
    }

    // Set up filter to add ID
    m_IndexOfID = filteredData.numAttributes();
    m_AddID.setIDIndex("" + (filteredData.numAttributes() + 1));
    m_AddID.setInputFormat(filteredData);
    filteredData = Filter.useFilter(filteredData, m_AddID);

    // Modify distance function for base method to skip ID.
    // User-specified range setting for the distance function is simply ignored.
    m_ModifiedSearchMethod = (NearestNeighbourSearch) new SerializedObject(
        getSearchMethod()).getObject();
    m_ModifiedSearchMethod.getDistanceFunction().setAttributeIndices(
        "1-" + m_IndexOfID);
    m_ModifiedSearchMethod.getDistanceFunction().setInvertSelection(false);

    // Set up the distance function
    m_ModifiedSearchMethod.setInstances(filteredData);
  } catch (Exception e) {
    e.printStackTrace();
  }
}
/** * Creates a given number of deep copies of the given estimator using * serialization. * * @param model the estimator to copy * @param num the number of estimator copies to create. * @return an array of estimators. * @exception Exception if an error occurs */ public static Estimator[] makeCopies(Estimator model, int num) throws Exception { if (model == null) { throw new Exception("No model estimator set"); } Estimator[] estimators = new Estimator[num]; SerializedObject so = new SerializedObject(model); for (int i = 0; i < estimators.length; i++) { estimators[i] = (Estimator) so.getObject(); } return estimators; }
/** * Creates a given number of deep copies of the given classifier using serialization. * * @param model the classifier to copy * @param num the number of classifier copies to create. * @return an array of classifiers. * @exception Exception if an error occurs */ public static Classifier [] makeCopies(Classifier model, int num) throws Exception { if (model == null) { throw new Exception("No model classifier set"); } Classifier [] classifiers = new Classifier [num]; SerializedObject so = new SerializedObject(model); for(int i = 0; i < classifiers.length; i++) { classifiers[i] = (Classifier) so.getObject(); } return classifiers; }