Thread starter: Lisrelchen

Java-ML: Java Machine Learning Library

#1 (OP)
Lisrelchen, posted 2016-6-27 02:32:58

Documentation





Keywords: Machine Learning, Library, Java


#2
Lisrelchen, posted 2016-6-27 02:34:58
/**
 * This file is part of the Java Machine Learning Library
 *
 * The Java Machine Learning Library is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 *
 * The Java Machine Learning Library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with the Java Machine Learning Library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 *
 * Copyright (c) 2006-2009, Thomas Abeel
 *
 * Project: http://java-ml.sourceforge.net/
 *
 */
package tutorials.clustering;

import java.io.File;

import net.sf.javaml.clustering.Clusterer;
import net.sf.javaml.clustering.KMeans;
import net.sf.javaml.core.Dataset;
import net.sf.javaml.tools.data.FileHandler;

/**
 * This tutorial shows how to use a clustering algorithm to cluster a data set.
 *
 * @author Thomas Abeel
 */
public class TutorialKMeans {

    /**
     * Tests the k-means algorithm with default parameter settings.
     */
    public static void main(String[] args) throws Exception {

        /* Load a dataset */
        Dataset data = FileHandler.loadDataset(new File("devtools/data/iris.data"), 4, ",");
        /*
         * Create a new instance of the KMeans algorithm, with no options
         * specified. By default this will generate 4 clusters.
         */
        Clusterer km = new KMeans();
        /*
         * Cluster the data; the result is returned as an array of data sets,
         * with each dataset representing a cluster.
         */
        Dataset[] clusters = km.cluster(data);
        System.out.println("Cluster count: " + clusters.length);

    }

}
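The default KMeans() constructor above settles on 4 clusters. As a small follow-up sketch (continuing from the data loaded in the tutorial and using only the KMeans(int) constructor that also appears later in this thread; a Java-ML Dataset is a collection of instances, so size() gives the cluster size), you can request a specific k and inspect each returned cluster:

/* Request 3 clusters instead of the default 4 */
Clusterer km3 = new KMeans(3);
Dataset[] clusters3 = km3.cluster(data);
/* Each element of the array is one cluster; print how many instances it holds */
for (int i = 0; i < clusters3.length; i++)
    System.out.println("Cluster " + i + " size: " + clusters3[i].size());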

#3
Lisrelchen, posted 2016-6-27 02:36:00
/**
 * This file is part of the Java Machine Learning Library
 *
 * The Java Machine Learning Library is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 *
 * The Java Machine Learning Library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with the Java Machine Learning Library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 *
 * Copyright (c) 2006-2009, Thomas Abeel
 *
 * Project: http://java-ml.sourceforge.net/
 *
 */
package tutorials.clustering;

import java.io.File;

import net.sf.javaml.clustering.Clusterer;
import net.sf.javaml.clustering.KMeans;
import net.sf.javaml.clustering.evaluation.AICScore;
import net.sf.javaml.clustering.evaluation.BICScore;
import net.sf.javaml.clustering.evaluation.ClusterEvaluation;
import net.sf.javaml.clustering.evaluation.SumOfSquaredErrors;
import net.sf.javaml.core.Dataset;
import net.sf.javaml.tools.data.FileHandler;

/**
 * Shows how to use the different cluster evaluation measures that are
 * implemented in Java-ML.
 *
 * @see net.sf.javaml.clustering.evaluation.*
 *
 * @author Thomas Abeel
 */
public class TutorialClusterEvaluation {

    public static void main(String[] args) throws Exception {
        /* Load a dataset */
        Dataset data = FileHandler.loadDataset(new File("devtools/data/iris.data"), 4, ",");
        /*
         * Create one instance of the KMeans algorithm that will create 3
         * clusters and one that will create 4 clusters.
         */
        Clusterer km3 = new KMeans(3);
        Clusterer km4 = new KMeans(4);
        /*
         * Cluster the data into 3 and into 4 clusters.
         */
        Dataset[] clusters3 = km3.cluster(data);
        Dataset[] clusters4 = km4.cluster(data);

        ClusterEvaluation aic = new AICScore();
        ClusterEvaluation bic = new BICScore();
        ClusterEvaluation sse = new SumOfSquaredErrors();

        double aicScore3 = aic.score(clusters3);
        double bicScore3 = bic.score(clusters3);
        double sseScore3 = sse.score(clusters3);

        double aicScore4 = aic.score(clusters4);
        double bicScore4 = bic.score(clusters4);
        double sseScore4 = sse.score(clusters4);

        System.out.println("AIC score: " + aicScore3 + "\t" + aicScore4);
        System.out.println("BIC score: " + bicScore3 + "\t" + bicScore4);
        System.out.println("Sum of squared errors: " + sseScore3 + "\t" + sseScore4);

    }
}
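These scores become more useful when compared across several candidate values of k. A minimal sketch under the same setup, reusing the loaded data and only the KMeans(int) constructor and SumOfSquaredErrors.score() shown above:

/* Compare the sum-of-squared-errors score for several cluster counts */
ClusterEvaluation sseEval = new SumOfSquaredErrors();
for (int k = 2; k <= 6; k++) {
    Dataset[] kClusters = new KMeans(k).cluster(data);
    System.out.println("k=" + k + "\tSSE=" + sseEval.score(kClusters));
}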

#4
Lisrelchen, posted 2016-6-27 02:36:41
/**
 * This file is part of the Java Machine Learning Library
 *
 * The Java Machine Learning Library is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 *
 * The Java Machine Learning Library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with the Java Machine Learning Library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA  02110-1301  USA
 *
 * Copyright (c) 2006-2009, Thomas Abeel
 *
 * Project: http://java-ml.sourceforge.net/
 *
 */
package tutorials.tools;

import java.io.File;

import net.sf.javaml.clustering.Clusterer;
import net.sf.javaml.core.Dataset;
import net.sf.javaml.tools.data.FileHandler;
import net.sf.javaml.tools.weka.WekaClusterer;
import weka.clusterers.XMeans;

/**
 * Tutorial on how to use a Weka clusterer in Java-ML.
 *
 * @author Thomas Abeel
 */
public class TutorialWekaClusterer {

    public static void main(String[] args) throws Exception {
        /* Load data */
        Dataset data = FileHandler.loadDataset(new File("devtools/data/iris.data"), 4, ",");
        /* Create Weka clusterer */
        XMeans xm = new XMeans();
        /* Wrap the Weka clusterer in the Java-ML bridge */
        Clusterer jmlxm = new WekaClusterer(xm);
        /* Perform clustering */
        Dataset[] clusters = jmlxm.cluster(data);
        /* Output results */
        System.out.println(clusters.length);
    }
}
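The WekaClusterer bridge is not specific to XMeans; any Weka clusterer can be wrapped the same way. A hedged sketch using Weka's SimpleKMeans (the setNumClusters call is plain Weka API and is assumed to be available in the Weka build bundled with Java-ML), continuing from the data loaded above:

/* Configure a Weka clusterer before wrapping it for Java-ML */
weka.clusterers.SimpleKMeans skm = new weka.clusterers.SimpleKMeans();
skm.setNumClusters(3);                       // ask Weka for 3 clusters
Clusterer bridge = new WekaClusterer(skm);   // same bridge class as above
Dataset[] wekaClusters = bridge.cluster(data);
System.out.println("Clusters found: " + wekaClusters.length);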

#5
Lisrelchen, posted 2016-6-27 02:38:02
Feature scoring
/* Load the iris data set */
Dataset data = FileHandler.loadDataset(new File("iris.data"), 4, ",");
/* Create a feature scoring algorithm */
GainRatio ga = new GainRatio();
/* Apply the algorithm to the data set */
ga.build(data);
/* Print out the score of each attribute */
for (int i = 0; i < ga.noAttributes(); i++)
    System.out.println(ga.score(i));
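Because score(i) returns one numeric score per attribute, the scores can be compared directly. A minimal sketch, using only the noAttributes() and score(i) calls from the snippet above, that reports the highest-scoring attribute:

/* Find the attribute with the highest gain ratio score */
int best = 0;
for (int i = 1; i < ga.noAttributes(); i++)
    if (ga.score(i) > ga.score(best))
        best = i;
System.out.println("Best attribute: " + best + " (score " + ga.score(best) + ")");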

#6
Lisrelchen, posted 2016-6-27 02:39:50
Feature Ranking
/* Load the iris data set */
Dataset data = FileHandler.loadDataset(new File("iris.data"), 4, ",");
/* Create a feature ranking algorithm */
RecursiveFeatureEliminationSVM svmrfe = new RecursiveFeatureEliminationSVM(0.2);
/* Apply the algorithm to the data set */
svmrfe.build(data);
/* Print out the rank of each attribute */
for (int i = 0; i < svmrfe.noAttributes(); i++)
    System.out.println(svmrfe.rank(i));
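As a usage note, the ranks can be turned into an ordering of the attribute indices. This sketch relies only on the noAttributes() and rank(i) calls shown above and assumes that, as stated for the ensemble ranker in a later post, lower ranks are better:

/* Order attribute indices from best (lowest rank) to worst */
Integer[] order = new Integer[svmrfe.noAttributes()];
for (int i = 0; i < order.length; i++)
    order[i] = i;
java.util.Arrays.sort(order, (a, b) -> Double.compare(svmrfe.rank(a), svmrfe.rank(b)));
System.out.println("Attributes, best first: " + java.util.Arrays.toString(order));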

#7
Lisrelchen, posted 2016-6-27 02:41:04
Feature subset selection
Thu, 01/15/2009 - 15:47 — Thomas Abeel
Subset selection algorithms differ from the scoring and ranking methods in that they only return the set of features that were selected, without further information on the quality of each individual feature.
The selected attribute indices are retrieved through the selectedAttributes() method, as shown in the snippet below.
/* Load the iris data set */
Dataset data = FileHandler.loadDataset(new File("iris.data"), 4, ",");
/* Construct a greedy forward subset selector */
GreedyForwardSelection ga = new GreedyForwardSelection(1, new PearsonCorrelationCoefficient());
/* Apply the algorithm to the data set */
ga.build(data);
/* Print out the attribute that has been selected */
System.out.println(ga.selectedAttributes());
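Since selectedAttributes() only yields the chosen attribute indices, using the result means going back to the data set with those indices. A sketch of one way to do that, assuming selectedAttributes() returns a collection of Integer indices (as its printed form suggests) and that instance(int) on Dataset and value(int) on Instance are the usual Java-ML accessors; treat those as assumptions rather than API shown above:

/* Read the selected attributes' values from the first instance */
for (Integer index : ga.selectedAttributes())
    System.out.println("Attribute " + index + " of first instance = " + data.instance(0).value(index));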

#8
Lisrelchen, posted 2016-6-27 02:50:20
Ensemble feature ranking
Thu, 01/15/2009 - 15:45 — Thomas Abeel

The ensemble feature ranking algorithm is another form of feature ranking and as such provides the following method to determine the rank of a feature. Lower ranks are better.

/* Load the iris data set */
Dataset data = FileHandler.loadDataset(new File("devtools/data/iris.data"), 4, ",");
/* Create an array of feature ranking algorithms */
RecursiveFeatureEliminationSVM[] svmrfes = new RecursiveFeatureEliminationSVM[10];
for (int i = 0; i < svmrfes.length; i++)
    svmrfes[i] = new RecursiveFeatureEliminationSVM(0.2);
LinearRankingEnsemble ensemble = new LinearRankingEnsemble(svmrfes);
/* Build the ensemble */
ensemble.build(data);
/* Get the rank of the i-th feature (lower ranks are better) */
int rank = ensemble.rank(i);
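To see the rank of every attribute rather than a single one, a short follow-up loop using only ensemble.rank(i); the bound of 4 assumes the iris data set, whose class label sits at index 4 in the loadDataset call above:

/* Print the ensemble rank of each of the four iris attributes (lower is better) */
for (int i = 0; i < 4; i++)
    System.out.println("Attribute " + i + " rank: " + ensemble.rank(i));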





#9
cheeko, posted 2016-6-27 07:18:25
Thanks for sharing.
