Index
A
activation function, 234
acuity, 258
AdaBoost, 328
AdaBoost.M1, 321, 416
Add, 395
AddCluster, 396, 397
AddExpression, 397
additive logistic regression, 327–328
additive regression, 325–327
AdditiveRegression, 416
AddNoise, 400
AD (all-dimensions) tree, 280–283
ADTree, 408
advanced methods. See implementation—real-world schemes
adversarial data mining, 356–358
aggregation, appropriate degree in data warehousing, 53
Akaike Information Criterion (AIC), 277
Alberta Ingenuity Centre for Machine Learning, 38
algorithms
additive logistic regression, 327
advanced methods, 187–283. See also implementation—real-world schemes
association rule mining, 112–119
bagging, 319
basic methods, 83–142. See also algorithms—basic methods
Bayesian network learning, 277–283
clustering, 136–139
clustering in Weka, 418–419
covering, 105–112
decision tree induction, 97–105
divide-and-conquer, 107
EM, 265–266
expanding examples into partial tree, 208
filtering in Weka, 393–403. See also filtering algorithms
incremental, 346
instance-based learning, 128–136
learning in Weka, 403–414. See also learning algorithms
linear models, 119–128
metalearning in Weka, 414–418
1R method, 84–88
perceptron learning rule, 124
RIPPER rule learner, 206
rule formation—incremental reduced-error pruning, 205
separate-and-conquer, 112
statistical modeling, 88–97
stochastic, 348
Winnow, 127
See also individual subject headings.
all-dimensions (AD) tree, 280–283
alternating decision tree, 329, 330, 343
Analyze panel, 443–445
analyzing purchasing patterns, 27
ancestor-of, 48
anomalies, 314–315