
Machine Learning & Pattern Recognition Series

Multilinear Subspace Learning Dimensionality Reduction of Multidimensional Data

Konstantinos N. Plataniotis

Anastasios N. Venetsanopoulos

Haiping Lu

Multilinear

Subspace Learning

Dimensionality Reduction of Multidimensional Data

Chapman & Hall/CRC

Machine Learning & Pattern Recognition Series

SERIES EDITORS

Ralf Herbrich

Amazon Development Center

Berlin, Germany

Thore Graepel

Microsoft Research Ltd.

Cambridge, UK

AIMS AND SCOPE

This series reflects the latest advances and applications in machine learning and pattern recognition through the publication of a broad range of reference works, textbooks, and handbooks. The inclusion of concrete examples, applications, and methods is highly encouraged. The scope of the series includes, but is not limited to, titles in the areas of machine learning, pattern recognition, computational intelligence, robotics, computational/statistical learning theory, natural language processing, computer vision, game AI, game theory, neural networks, computational neuroscience, and other relevant topics, such as machine learning applied to bioinformatics or cognitive science, which might be proposed by potential contributors.

PUBLISHED TITLES

MACHINE LEARNING: An Algorithmic Perspective

Stephen Marsland

HANDBOOK OF NATURAL LANGUAGE PROCESSING, Second Edition

Nitin Indurkhya and Fred J. Damerau

UTILITY-BASED LEARNING FROM DATA

Craig Friedman and Sven Sandow

A FIRST COURSE IN MACHINE LEARNING

Simon Rogers and Mark Girolami

COST-SENSITIVE MACHINE LEARNING

Balaji Krishnapuram, Shipeng Yu, and Bharat Rao

ENSEMBLE METHODS: FOUNDATIONS AND ALGORITHMS

Zhi-Hua Zhou

MULTI-LABEL DIMENSIONALITY REDUCTION

Liang Sun, Shuiwang Ji, and Jieping Ye

BAYESIAN PROGRAMMING

Pierre Bessière, Emmanuel Mazer, Juan-Manuel Ahuactzin, and Kamel Mekhnacha

MULTILINEAR SUBSPACE LEARNING: DIMENSIONALITY REDUCTION OF MULTIDIMENSIONAL DATA

Haiping Lu, Konstantinos N. Plataniotis, and Anastasios N. Venetsanopoulos

Chapman & Hall/CRC

Machine Learning & Pattern Recognition Series

Multilinear

Subspace Learning

Dimensionality Reduction of Multidimensional Data

Haiping Lu

Konstantinos N. Plataniotis

Anastasios N. Venetsanopoulos

MATLAB® is a trademark of The MathWorks, Inc. and is used with permission. The MathWorks does not warrant the accuracy of the text or exercises in this book. This book’s use or discussion of MATLAB® software or related products does not constitute endorsement or sponsorship by The MathWorks of a particular pedagogical approach or particular use of the MATLAB® software.

CRC Press

Taylor & Francis Group

6000 Broken Sound Parkway NW, Suite 300

Boca Raton, FL 33487-2742

© 2014 by Taylor & Francis Group, LLC

CRC Press is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works

Version Date: 20131021

International Standard Book Number-13: 978-1-4398-5729-8 (eBook - PDF)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com

and the CRC Press Web site at http://www.crcpress.com

3 Fundamentals of Multilinear Subspace Learning
3.1 Multilinear Algebra Preliminaries
3.1.1 Notations and Definitions
3.1.2 Basic Operations
3.1.3 Tensor/Matrix Distance Measure
3.2 Tensor Decompositions
3.2.1 CANDECOMP/PARAFAC
3.2.2 Tucker Decomposition and HOSVD
3.3 Multilinear Projections
3.3.1 Vector-to-Vector Projection
3.3.2 Tensor-to-Tensor Projection
3.3.3 Tensor-to-Vector Projection
3.4 Relationships among Multilinear Projections
3.5 Scatter Measures for Tensors and Scalars
3.5.1 Tensor-Based Scatters
3.5.2 Scalar-Based Scatters
3.6 Summary
3.7 Further Reading

4 Overview of Multilinear Subspace Learning
4.1 Multilinear Subspace Learning Framework
4.2 PCA-Based MSL Algorithms
4.2.1 PCA-Based MSL through TTP
4.2.2 PCA-Based MSL through TVP
4.3 LDA-Based MSL Algorithms
4.3.1 LDA-Based MSL through TTP
4.3.2 LDA-Based MSL through TVP
4.4 History and Related Works
4.4.1 History of Tensor Decompositions
4.4.2 Nonnegative Matrix and Tensor Factorizations
4.4.3 Tensor Multiple Factor Analysis and Multilinear Graph Embedding
4.5 Future Research on MSL
4.5.1 MSL Algorithm Development
4.5.2 MSL Application Exploration
4.6 Summary
4.7 Further Reading

5 Algorithmic and Computational Aspects
5.1 Alternating Partial Projections for MSL
5.2 Initialization
5.2.1 Popular Initialization Methods
5.2.2 Full Projection Truncation
5.2.3 Interpretation of Mode-n Eigenvalues
5.2.4 Analysis of Full Projection Truncation
5.3 Projection Order, Termination, and Convergence
5.4 Synthetic Data for Analysis of MSL Algorithms
5.5 Feature Selection for TTP-Based MSL
5.5.1 Supervised Feature Selection
5.5.2 Unsupervised Feature Selection
5.6 Computational Aspects
5.6.1 Memory Requirements and Storage Needs
5.6.2 Computational Complexity
5.6.3 MATLAB® Implementation Tips for Large Datasets
5.7 Summary

6 Multilinear Principal Component Analysis
6.1 Generalized PCA
6.1.1 GPCA Problem Formulation
6.1.2 GPCA Algorithm Derivation
6.1.3 Discussions on GPCA
6.1.4 Reconstruction Error Minimization
6.2 Multilinear PCA
6.2.1 MPCA Problem Formulation
6.2.2 MPCA Algorithm Derivation
6.2.3 Discussions on MPCA
6.2.4 Subspace Dimension Determination
6.2.4.1 Sequential Mode Truncation
6.2.4.2 Q-Based Method
6.3 Tensor Rank-One Decomposition
6.3.1 TROD Problem Formulation
6.3.2 Greedy Approach for TROD
6.3.3 Solving for the pth EMP
6.4 Uncorrelated Multilinear PCA
6.4.1 UMPCA Problem Formulation
6.4.2 UMPCA Algorithm Derivation
6.4.3 Discussions on UMPCA
6.5 Boosting with MPCA
6.5.1 Benefits of MPCA-Based Booster
6.5.2 LDA-Style Boosting on MPCA Features
6.5.3 Modified LDA Learner
6.6 Other Multilinear PCA Extensions
6.6.1 Two-Dimensional PCA
6.6.2 Generalized Low Rank Approximation of Matrices
6.6.3 Concurrent Subspace Analysis
6.6.4 MPCA plus LDA
6.6.5 Nonnegative MPCA
6.6.6 Robust Versions of MPCA
6.6.7 Incremental Extensions of MPCA
6.6.8 Probabilistic Extensions of MPCA
6.6.9 Weighted MPCA and MPCA for Binary Tensors

7 Multilinear Discriminant Analysis
7.1 Two-Dimensional LDA
7.1.1 2DLDA Problem Formulation
7.1.2 2DLDA Algorithm Derivation
7.2 Discriminant Analysis with Tensor Representation
7.2.1 DATER Problem Formulation
7.2.2 DATER Algorithm Derivation
7.3 General Tensor Discriminant Analysis
7.4 Tensor Rank-One Discriminant Analysis
7.4.1 TR1DA Problem Formulation
7.4.2 Solving for the pth EMP
7.5 Uncorrelated Multilinear Discriminant Analysis
7.5.1 UMLDA Problem Formulation
7.5.2 R-UMLDA Algorithm Derivation
7.5.3 Aggregation of R-UMLDA Learners
7.6 Other Multilinear Extensions of LDA
7.6.1 Graph-Embedding for Dimensionality Reduction
7.6.2 Graph-Embedding Extensions of Multilinear Discriminant Analysis
7.6.3 Incremental and Sparse Multilinear Discriminant Analysis

8 Multilinear ICA, CCA, and PLS
8.1 Overview of Multilinear ICA Algorithms

8.1.1 Multilinear Approaches for ICA on Vector-Valued Data
8.1.2 Multilinear Approaches for ICA on Tensor-Valued Data
8.2 Multilinear Modewise ICA
8.2.1 Multilinear Mixing Model for Tensors
8.2.2 Regularized Estimation of Mixing Tensor
8.2.3 MMICA Algorithm Derivation
8.2.4 Architectures and Discussions on MMICA
8.2.5 Blind Source Separation on Synthetic Data
8.3 Overview of Multilinear CCA Algorithms
8.4 Two-Dimensional CCA
8.4.1 2D-CCA Problem Formulation
8.4.2 2D-CCA Algorithm Derivation
8.5 Multilinear CCA
8.5.1 MCCA Problem Formulation
8.5.2 MCCA Algorithm Derivation
8.5.3 Discussions on MCCA
8.6 Multilinear PLS Algorithms
8.6.1 N-Way PLS
8.6.2 Higher-Order PLS

9 Applications of Multilinear Subspace Learning
9.1 Pattern Recognition System
9.2 Face Recognition
9.2.1 Algorithms and Their Settings
9.2.2 Recognition Results for Supervised Learning Algorithms
9.2.3 Recognition Results for Unsupervised Learning Algorithms
9.3 Gait Recognition
9.4 Visual Content Analysis in Computer Vision
9.4.1 Crowd Event Visualization and Clustering
9.4.2 Target Tracking in Video
9.4.3 Action, Scene, and Object Recognition
9.5 Brain Signal/Image Processing in Neuroscience
9.5.1 EEG Signal Analysis
9.5.2 fMRI Image Analysis
9.6 DNA Sequence Discovery in Bioinformatics
9.7 Music Genre Classification in Audio Signal Processing
9.8 Data Stream Monitoring in Data Mining
9.9 Other MSL Applications

Appendix A Mathematical Background
A.1 Linear Algebra Preliminaries
A.1.1 Transpose
A.1.2 Identity and Inverse Matrices
A.1.3 Linear Independence and Vector Space Basis
A.1.4 Products of Vectors and Matrices
A.1.5 Vector and Matrix Norms
A.1.6 Trace
A.1.7 Determinant
A.1.8 Eigenvalues and Eigenvectors
A.1.9 Generalized Eigenvalues and Eigenvectors
A.1.10 Singular Value Decomposition
A.1.11 Power Method for Eigenvalue Computation
A.2 Basic Probability Theory
A.2.1 One Random Variable
A.2.2 Two Random Variables
A.3 Basic Constrained Optimization
A.4 Basic Matrix Calculus
A.4.1 Basic Derivative Rules
A.4.2 Derivative of Scalar/Vector with Respect to Vector
A.4.3 Derivative of Trace with Respect to Matrix
A.4.4 Derivative of Determinant with Respect to Matrix

Appendix B Data and Preprocessing
B.1 Face Databases and Preprocessing
B.1.1 PIE Database
B.1.2 FERET Database
B.1.3 Preprocessing of Face Images for Recognition
B.2 Gait Database and Preprocessing
B.2.1 USF Gait Challenge Database
B.2.2 Gait Silhouette Extraction
B.2.3 Normalization of Gait Samples

Appendix C Software
C.1 Software for Multilinear Subspace Learning
C.2 Benefits of Open-Source Software
C.3 Software Development Tips

List of Figures

1.1 Examples of second-order tensor (matrix) data.
1.2 Examples of third-order tensor data.
1.3 Examples of fourth-order tensor data.
1.4 A third-order tensor formed by the Gabor filter outputs of a gray-level face image.
1.5 A face can be represented as one point in an image space of the same size.
1.6 Linear versus multilinear mapping.
1.7 Reshaping (vectorization) of a 32 × 32 face image to a 1024 × 1 vector breaks the natural structure and correlation in the original face image.
1.8 Vector-based versus tensor-based analysis of a 3D object.
1.9 The field of matrix computations seems to “kick up” its level of thinking about every 20 years.
1.10 Illustration of second-order feature characteristics.
1.11 Illustration of third-order feature characteristics.

2.1 Linear subspace learning solves for a projection matrix U, which maps a high-dimensional vector x to a low-dimensional vector y.

3.1 Multilinear subspace learning finds a lower-dimensional representation by direct mapping of tensors through a multilinear projection.
3.2 Illustration of tensors of order N = 0, 1, 2, 3, 4.
3.3 Illustration of the mode-n vectors.
3.4 Illustration of the mode-n slices.
3.5 An example of a second-order rank-one tensor (that is, a rank-one matrix): A = u^(1) ◦ u^(2) = u^(1) (u^(2))^T.
3.6 The diagonal of a third-order cubical tensor.
3.7 Visual illustration of the mode-1 unfolding.
3.8 Visual illustration of the mode-n (mode-1) multiplication.
3.9 The CANDECOMP/PARAFAC decomposition of a third-order tensor.
3.10 The Tucker decomposition of a third-order tensor.
3.11 Illustration of (a) vector-to-vector projection, (b) tensor-to-tensor projection, and (c) tensor-to-vector projection, where EMP stands for elementary multilinear projection.
3.12 Illustration of an elementary multilinear projection.
3.13 Comparison of the number of parameters to be estimated by VVP, TVP, and TTP, normalized with respect to the number by VVP for visualization.

4.1 The multilinear subspace learning (MSL) framework.
4.2 A taxonomy of PCA-based MSL algorithms.
4.3 A taxonomy of LDA-based MSL algorithms.
4.4 Overview of the history of tensor decomposition and multilinear subspace learning.

5.1 Typical flow of a multilinear subspace learning algorithm.
5.2 Visual interpretation of (a) the total scatter tensor, (b) the mode-1 eigenvalues, (c) the mode-2 eigenvalues, and (d) the mode-3 eigenvalues of the respective mode-n total scatter matrix for input samples.
5.3 Plots of (a) the eigenvalue magnitudes, and (b) their cumulative distributions for synthetic datasets db1, db2, and db3.
5.4 Plots of (a) the eigenvalue magnitudes and (b) their cumulative distributions for the gallery set of the USF Gait database V.1.7 (up to thirty eigenvalues).

6.1 Multilinear PCA algorithms under the MSL framework.
6.2 Illustration of recognition through LDA-style boosting with regularization on MPCA features.

7.1 Multilinear discriminant analysis algorithms under the MSL framework.

8.1 Multilinear ICA, CCA, and PLS algorithms under the MSL framework.
8.2 The structured data in (a) are all mixtures generated from the source data in (b) with a multilinear mixing model.
8.3 Blind source separation by MMICA on synthetic data.
8.4 Schematic of multilinear CCA for paired (second-order) tensor datasets with two architectures.

9.1 A typical pattern recognition system.
9.2 A face image represented as a second-order tensor (matrix) of column × row.
9.3 Face recognition results by supervised subspace learning algorithms on the PIE database.
9.4 Face recognition results by unsupervised subspace learning algorithms on the FERET database.
9.5 A gait silhouette sequence as a third-order tensor of column × row × time.
9.6 The correct recognition rates of supervised subspace learning algorithms on the 32 × 22 × 10 USF gait database V.1.7.
9.7 Multichannel electroencephalography (EEG) signals with each channel as a time series recorded by an electrode placed on the scalp.
9.8 A functional magnetic resonance imaging (fMRI) scan sequence with three spatial modes and one temporal mode.
9.9 A 2D auditory spectrogram representation of music signals.
9.10 Network traffic data organized as a third-order tensor of source IP × destination IP × port number.

B.1 Illustration of face image preprocessing.
B.2 Sample face images of one subject from the CMU PIE database.
B.3 Examples of face images from two subjects in the FERET database.
B.4 Sample frames from the Gait Challenge datasets.
B.5 Illustration of the silhouette extraction process.
B.6 Three gait samples from the USF Gait database V.1.7, shown by concatenating frames in rows.

List of Tables

2.1 PCA, LDA, CCA, and PLS can all be viewed as solving the generalized eigenvalue problem Av = λBv.
3.1 Number of parameters to be estimated by three multilinear projections.
4.1 Linear versus multilinear subspace learning.
5.1 Order of computational complexity of eigendecomposition for multilinear subspace learning (MSL) and linear subspace learning (LSL).
9.1 Six distance (dissimilarity) measures d(a, b) between feature vectors a ∈ R^H and b ∈ R^H, with an optional weight vector w ∈ R^H.
B.1 Characteristics of the gait data from the USF Gait Challenge datasets version 1.7.

List of Algorithms

2.1 Principal component analysis (PCA)
2.2 Linear discriminant analysis (LDA)
2.3 Canonical correlation analysis (CCA)
2.4 Nonlinear iterative partial least squares (NIPALS)
2.5 PLS1 regression
2.6 Adaptive boosting (AdaBoost)
5.1 A typical TTP-based multilinear subspace learning algorithm
5.2 A typical TVP-based multilinear subspace learning algorithm
6.1 Generalized PCA (GPCA)
6.2 Multilinear principal component analysis (MPCA)
6.3 Tensor rank-one decomposition (TROD)
6.4 Uncorrelated multilinear PCA (UMPCA)
6.5 LDA-style booster based on MPCA features
7.1 Two-dimensional LDA (2DLDA)
7.2 Discriminant analysis with tensor representation (DATER)
7.3 General tensor discriminant analysis (GTDA)
7.4 Tensor rank-one discriminant analysis (TR1DA)
7.5 Regularized uncorrelated multilinear discriminant analysis (R-UMLDA)
7.6 Regularized UMLDA with aggregation (R-UMLDA-A)
8.1 Multilinear modewise ICA (MMICA)
8.2 Two-dimensional CCA (2D-CCA)
8.3 Multilinear CCA for matrix sets (Architecture I)
8.4 Tri-linear PLS1 [N-way PLS (N-PLS)]
8.5 Higher-order PLS (HOPLS)

Acronyms and Symbols

Acronym  Description

AdaBoost  Adaptive boosting
ALS  Alternating least squares
APP  Alternating partial projections
BSS  Blind source separation
CANDECOMP  Canonical decomposition
CCA  Canonical correlation analysis
CRR  Correct recognition rate
DATER  Discriminant analysis with tensor representation
EMP  Elementary multilinear projection
FPT  Full projection truncation
GPCA  Generalized PCA
GTDA  General tensor discriminant analysis
HOPLS  Higher-order PLS
HOSVD  High-order SVD
IC  Independent component
ICA  Independent component analysis
LDA  Linear discriminant analysis
LSL  Linear subspace learning
MCCA  Multilinear CCA
MMICA  Multilinear modewise ICA
MPCA  Multilinear PCA
MSL  Multilinear subspace learning
NIPALS  Nonlinear iterative partial least squares
NMF  Nonnegative matrix factorization
N-PLS  N-way PLS
NTF  Nonnegative tensor factorization
PARAFAC  Parallel factors
PC  Principal component

PCA  Principal component analysis

PLS  Partial least squares
R-UMLDA  Regularized UMLDA
R-UMLDA-A  Regularized UMLDA with aggregation
SMT  Sequential mode truncation
SSS  Small sample size
SVD  Singular value decomposition
SVM  Support vector machine
TR1DA  Tensor rank-one discriminant analysis
TROD  Tensor rank-one decomposition
TTP  Tensor-to-tensor projection
TVP  Tensor-to-vector projection
UMLDA  Uncorrelated multilinear discriminant analysis
UMPCA  Uncorrelated MPCA
VVP  Vector-to-vector projection

Symbol  Description

|A|  Determinant of matrix A
||·||_F  Frobenius norm
a or A (italic)  A scalar
a (bold lowercase)  A vector
A (bold uppercase)  A matrix
A (calligraphic)  A tensor
Ā  The mean of samples {A_m} (matrices or tensors)
A^T  Transpose of matrix A
A^{-1}  Inverse of matrix A
A(i1, i2)  Entry at the i1th row and i2th column of A
A_(n)  Mode-n unfolding of tensor A
<A, B>  Scalar product of A and B
A ×_n U  Mode-n product of A by U
a ◦ b  Outer (tensor) product of a and b
C  Number of classes
c  Class index
c_m  Class label for the mth training sample, the mth element of the class vector c
δ_pq  Kronecker delta, δ_pq = 1 iff p = q and 0 otherwise
∂f(x)/∂x  Partial derivative of f with respect to x
g_p  The pth coordinate vector
g_pm  g_p(m), the mth element of g_p; see y_mp
H_y  Number of selected features in MSL
I  An identity matrix
I_n  Mode-n dimension, or mode-n dimension for the first set in CCA/PLS extensions
J_n  Mode-n dimension for the second set in CCA/PLS extensions
K  Maximum number of iterations
k  Iteration step index
L  Number of training samples for each class
M  Number of training samples
m  Index of training sample
M_c  Number of training samples in class c
N  Order of a tensor, number of indices/modes
n  Mode index of a tensor
P  Dimension of the output vector, also number of EMPs in a TVP, or number of latent factors in PLS
P_n  Mode-n dimension in the projected (output) space of a TTP
p  Index of the output vector, also index of the EMP in a TVP, or index of latent factor in PLS
Ψ_B  Between-class scatter (measure)
Ψ_T  Total scatter (measure)
Ψ_W  Within-class scatter (measure)
Q  Ratio of total scatter kept in each mode
R  The set of real numbers
r_m  The (TVP) projection of the first set sample X_m in second-order MCCA
ρ  Sample Pearson correlation
S_B  Between-class scatter matrix in LSL
S_B^(n)  Mode-n between-class scatter matrix in MSL
S_Byp  Between-class scatter of the pth EMP projections {y_mp, m = 1, ..., M}
S_T  Total scatter matrix in LSL
S_T^(n)  Mode-n total scatter matrix in MSL
S_Typ  Total scatter of the pth EMP projections {y_mp, m = 1, ..., M}
S_W  Within-class scatter matrix in LSL
S_W^(n)  Mode-n within-class scatter matrix in MSL
S_Wyp  Within-class scatter of the pth EMP projections {y_mp, m = 1, ..., M}
s_m  The (TVP) projection of the second set sample Y_m in second-order MCCA
tr(A)  The trace of matrix A
X_m  The mth input tensor sample
x_m  The mth input vector sample, or the mth sample in the first set in CCA/PLS
U  Projection matrix in LSL
U* or u*  The (sub)optimal solution of U or u
U^(n)  Mode-n projection matrix
{U^(n)}  A TTP, consisting of N projection matrices
u^(n)  Mode-n projection vector
u_p  The pth projection vector in LSL, or the pth mode-2 projection vector in tri-linear PLS1 (N-PLS)
u_xp  The pth projection vector for the first set in CCA/PLS, or the pth mode-1 projection vector for the first set in second-order MCCA, or the pth latent vector in HOPLS
u_yp  The pth projection vector for the second set in CCA/PLS, or the pth mode-1 projection vector for the second set in second-order MCCA
{u_p^(n)}  The pth EMP in a TVP, consisting of N projection vectors
{u_p^(n)}_P^N  A TVP, consisting of P EMPs (P × N projection vectors)
vec(A)  Vectorized representation of a tensor A
v_xp  The pth mode-2 projection vector for the first set in second-order MCCA
v_yp  The pth mode-2 projection vector for the second set in second-order MCCA
w_p  The pth coordinate vector for the first set in CCA/PLS or second-order MCCA, or the pth latent factor in tri-linear PLS1 (N-PLS)
X_m  The mth (training/input) tensor sample

Y_m  Projection of X_m on a TTP {U^(n)}, or the mth sample in the second set in CCA/PLS extensions
Ý^(n)  Mode-n partial multilinear projection of raw samples in TTP
Ŷ^(n)  Mode-n partial multilinear projection of centered (zero-mean) samples in TTP
y_m  Vector projection of X_m (rearranged from the TTP projection Y_m in TTP-based MSL, or projection on a TVP in TVP-based MSL), or the mth sample in the second set in CCA/PLS
ý_p^(n)  Mode-n partial multilinear projection of raw samples in the pth EMP of a TVP
ŷ_p^(n)  Mode-n partial multilinear projection of centered (zero-mean) samples in the pth EMP of a TVP
y_mp  y_m(p) = g_p(m), the projection of X_m on the pth EMP {u_p^(n)}
z_p  The pth coordinate vector for the second set in CCA/PLS or second-order MCCA

Preface

With the advances in sensor, storage, and networking technologies, bigger and bigger data are being generated on a daily basis in a wide range of applications, especially in emerging cloud computing, mobile Internet, and big data applications. Most real-world data, either big or small, have multidimensional representations. Two-dimensional (2D) data include gray-level images in computer vision and image processing, multichannel electroencephalography (EEG) signals in neuroscience and biomedical engineering, and gene expression data in bioinformatics. Three-dimensional (3D) data include 3D objects in generic object recognition, hyperspectral cubes in remote sensing, and gray-level video sequences in activity or gesture recognition for surveillance and human–computer interaction. A functional magnetic resonance imaging (fMRI) sequence in neuroimaging is an example of four-dimensional (4D) data. Other multidimensional data appear in medical image analysis, content-based retrieval, and space-time super-resolution. In addition, many streaming data and mining data are frequently organized in multidimensional representations, such as those in social network analysis, Web data mining, sensor network analysis, and network forensics. Moreover, multiple features (e.g., different image cues) can also form higher-order tensors in feature fusion.

These multidimensional data are usually very high-dimensional, with a large amount of redundancy, and occupy only a small subspace of the entire input space. Therefore, dimensionality reduction is frequently employed to map high-dimensional data to a low-dimensional space while retaining as much information as possible. Linear subspace learning (LSL) algorithms are traditional dimensionality reduction techniques that represent input data as vectors and solve for an optimal linear mapping to a lower-dimensional space. However, they often become inadequate when dealing with big multidimensional data. They result in very high-dimensional vectors, lead to the estimation of a large number of parameters, and also break the natural structure and correlation in the original data.
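To make the parameter-count concern concrete, here is a back-of-the-envelope comparison of our own (the exact counts for each projection type are tabulated later in Table 3.1), written as a few lines of MATLAB®:

I1 = 100; I2 = 100;          % input image dimensions
p_vvp = (I1 * I2) * 10;      % vectorized linear map to 10 features: 100,000
p_tvp = 10 * (I1 + I2);      % tensor-to-vector projection, 10 EMPs: 2,000
p_ttp = I1 * 5 + I2 * 5;     % tensor-to-tensor projection to 5 x 5: 1,000

Vectorizing first costs two orders of magnitude more parameters than projecting each mode separately, which is the basic economy that motivates the multilinear approach.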

Due to the above challenges, especially in emerging big data applications, there has been an urgent need for more efficient dimensionality reduction schemes for big multidimensional data. Consequently, there has been a growing interest in multilinear subspace learning (MSL), which reduces the dimensionality of big data directly from their natural multidimensional representation: tensors, which refer to multidimensional arrays here. The research on MSL has progressed from heuristic exploration to systematic investigation, while the recent prevalence of big data applications has increased the demand for technical developments in this emerging research field. Thus, we found that there is a strong need for a new book devoted to the fundamentals and foundations of MSL, as well as MSL algorithms and their applications.

The primary goal of this book is to give a comprehensive introduction to both theoretical and practical aspects of MSL for dimensionality reduction of multidimensional data. It expects not only to detail recent advances in MSL, but also to trace the history and explore future developments and emerging applications. In particular, the emphasis is on the fundamental concepts and system-level perspectives. This book provides a foundation upon which we can build solutions for many of today's most interesting and challenging problems in big multidimensional data processing. Specifically, it includes the following important topics in MSL: multilinear algebra fundamentals, multilinear projections, MSL framework formulation, MSL optimality criterion construction, and MSL algorithms, solutions, and applications. The MSL framework enables us to develop MSL algorithms systematically with various optimality criteria. Under this unifying MSL framework, a number of MSL algorithms are discussed and analyzed in detail. This book covers their applications in various fields, and provides their pseudocodes and implementation tips to help practitioners in further development, evaluation, and application. MATLAB® source codes are made available online.

The topics covered in this book are of great relevance and importance to both theoreticians and practitioners who are interested in learning compact features from big multidimensional data in machine learning and pattern recognition. Most examples given in this book highlight our own experiences, which are directly relevant for researchers who work on applications in video surveillance, biometrics, and object recognition. This book can be a useful reference for researchers dealing with big multidimensional data in areas such as computer vision, image processing, audio and speech processing, machine learning, pattern recognition, data mining, remote sensing, neurotechnology, bioinformatics, and biomedical engineering. It can also serve as a valuable resource for advanced courses in these areas. In addition, this book can serve as a good reference for graduate students and instructors in the departments of electrical engineering, computer engineering, computer science, biomedical engineering, and bioinformatics whose orientation is in subjects where dimensionality reduction of big multidimensional data is essential.

We organize this book into two parts. The "ingredients" are in Part I while the "dishes" are in Part II. On the first page of each chapter, we include a figure serving as a "graphic abstract" for the chapter wherever possible.

In summary, this book provides a foundation for solving many dimensionality reduction problems in multidimensional data applications. It is our hope that its publication will foster more principled and successful applications of MSL in a wide range of research disciplines.

We have set up the following websites for this book:

http://www.comp.hkbu.edu.hk/~haiping/MSL.html
http://www.dsp.toronto.edu/~haiping/MSL.html
https://sites.google.com/site/tensormsl/

We will update these websites with open source software, possible corrections, and any other useful materials to distribute after publication of this book.

The authors would like to thank the Edward S. Rogers Sr. Department of Electrical and Computer Engineering, University of Toronto, for supporting this research work. H. Lu would like to thank the Institute for Infocomm Research, the Agency for Science, Technology and Research (A*STAR), in particular, How-Lung Eng, Cuntai Guan, Joo-Hwee Lim, and Yiqun Li, for hosting him for almost four years. H. Lu would also like to thank the Department of Computer Science, Hong Kong Baptist University, in particular, Pong C. Yuen and Jiming Liu, for supporting this work. We thank Dimitrios Hatzinakos, Raymond H. Kwong, and Emil M. Petriu for their help in our work on this topic. We thank Kar-Ann Toh, Constantine Kotropoulos, Andrew Teoh, and Althea Liang for reading through the draft and offering useful comments and suggestions. We would also like to thank the many anonymous reviewers of our papers who have given us tremendous help in advancing this field. This book would not have been possible without the contributions from other researchers in this field. In particular, we want to thank the following researchers whose works have been particularly inspiring and helpful to us: Lieven De Lathauwer, Tamara G. Kolda, Amnon Shashua, Jian Yang, Jieping Ye, Xiaofei He, Deng Cai, Dacheng Tao, Shuicheng Yan, Dong Xu, and Xuelong Li. We also thank editor Randi Cohen and the staff at CRC Press, Taylor & Francis Group, for their support during the writing of this book.

Haiping Lu
Hong Kong

Konstantinos N. Plataniotis
Anastasios N. Venetsanopoulos
Toronto

For MATLAB® product information, please contact:

Tel: 508-647-7000
Fax: 508-647-7001
E-mail: info@mathworks.com
Web: www.mathworks.com

Chapter 1

Introduction

With the advances in sensor, storage, and networking technologies, bigger and bigger data are being generated daily in a wide range of applications. Figures 1.1 through 1.4 show some examples in computer vision, audio processing, neuroscience, remote sensing, and data mining. To succeed in this era of big data [Howe et al., 2008], it becomes more and more important to learn compact features for efficient processing. Most big data are multidimensional and they can often be represented as multidimensional arrays, which are referred to as tensors in mathematics [Kolda and Bader, 2009]. Thus, tensor-based computation is emerging, especially with the growth of mobile Internet [Lenhart et al., 2010], cloud computing [Armbrust et al., 2010], and big data such as the MapReduce model [Dean and Ghemawat, 2008; Kang et al., 2012].

This book deals with tensor-based learning of compact features from multidimensional data. In particular, we focus on multilinear subspace learning (MSL) [Lu et al., 2011], a dimensionality reduction [Burges, 2010] method developed for tensor data. The objective of MSL is to learn a direct mapping from high-dimensional tensor representations to low-dimensional vector/tensor representations.
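The book's accompanying MATLAB® code is distributed online rather than printed here; as a rough illustration of what such a direct mapping looks like, the following minimal sketch (our own; the function names ttp and mode_n_product are invented for this example) applies a tensor-to-tensor projection as a sequence of mode-n products:

% Minimal sketch (our illustration, not the book's released code):
% a tensor-to-tensor projection (TTP) maps tensor X to a smaller
% tensor Y = X x_1 U{1}' x_2 U{2}' ... x_N U{N}'.
function Y = ttp(X, U)
    % X: input tensor; U: cell array, U{n} is I_n x P_n with P_n < I_n
    Y = X;
    for n = 1:numel(U)
        Y = mode_n_product(Y, U{n}', n);   % shrink mode n: I_n -> P_n
    end
end

function B = mode_n_product(A, M, n)
    % Mode-n product B = A x_n M: unfold A along mode n, multiply, fold back
    sz = size(A);
    order = [n, 1:n-1, n+1:ndims(A)];
    An = reshape(permute(A, order), sz(n), []);  % mode-n unfolding A_(n)
    sz(n) = size(M, 1);
    B = ipermute(reshape(M * An, sz(order)), order);
end

% Example: project a 32 x 32 x 10 sample down to 5 x 5 x 3
% X = randn(32, 32, 10);
% Y = ttp(X, {randn(32, 5), randn(32, 5), randn(10, 3)});

Note that each mode is projected by its own small matrix; no vectorization of X is ever needed, which is the point of the direct mapping.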

FIGURE 1.1: Examples of second-order tensor (matrix) data: (a) a gray-level image, (b) multichannel electroencephalography (EEG) signals ("Electroencephalography," Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Electroencephalography), (c) an auditory spectrogram.



1.1 Tensor Representation of Multidimensional Data

Multidimensional data can be naturally represented as multidimensional (multiway) arrays, which are referred to as tensors in mathematics [Lang, 1984; Kolda and Bader, 2009]. The number of dimensions (ways) N defines the order of a tensor, and the elements (entries) of a tensor are addressed by N indices. Each index defines one mode. Tensor is a generalization of vector and matrix. Scalars are zero-order tensors, vectors are first-order tensors, matrices are second-order tensors, and tensors of order three or higher (N ≥ 3) are called higher-order tensors [De Lathauwer et al., 2000a; Kolda and Bader, 2009].
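In MATLAB® terms (an illustrative snippet of ours, not from the text), an Nth-order tensor is simply an N-way array whose entries are addressed by N subscripts, one per mode:

a = 3.14;            % a scalar: zero-order tensor, no index needed
v = [1; 2; 3];       % a vector: first-order tensor, one index v(i)
M = eye(4);          % a matrix: second-order tensor, two indices M(i1, i2)
T = rand(4, 3, 2);   % a third-order tensor, three indices T(i1, i2, i3)
T(2, 1, 2)           % one entry, addressed by N = 3 indices (one per mode)
ndims(T)             % the order N of the tensor, 3 here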

Tensor terminology: The term tensor has different meanings in mathematics and physics. The usage in this book refers to its meaning in mathematics, in particular multilinear algebra [De Lathauwer et al., 2000b,a; Greub, 1967; Lang, 1984]. In physics, the same term generally refers to a tensor field [Lebedev and Cloud, 2003], a generalization of a vector field. It is an association of a tensor with each point of a geometric space and it varies continuously with position.

Second-order tensor (matrix) data are two-dimensional (2D), with some examples shown in Figure 1.1. Figure 1.1(a) shows a gray-level face image in computer vision applications, with spatial column and row modes. Figure 1.1(b) depicts multichannel electroencephalography (EEG) signals in neuroscience, where the two modes consist of channel and time. Figure 1.1(c) shows an audio spectrogram in audio and speech processing with frequency and time modes.

Third-order tensor data are three-dimensional (3D), with some examples shown in Figure 1.2. Figure 1.2(a) is a 3D face object in computer vision or computer graphics [Sahambi and Khorasani, 2003], with three modes of (spatial) column, (spatial) row, and depth. Figure 1.2(b) shows a hyperspectral cube in remote sensing [Renard and Bourennane, 2009], with three modes of column, row, and spectral wavelength. Figure 1.2(c) depicts a binary gait video sequence for activity or gesture recognition in computer vision or human-computer interaction (HCI) [Chellappa et al., 2005; Green and Guan, 2004], with the column, row, and time modes. Figure 1.2(d) illustrates social network analysis data organized in three modes of conference, author, and keyword [Sun et al., 2006]. Figures 1.2(e) and 1.2(f) demonstrate web graph mining data organized in three modes of source, destination, and text, and environmental sensor monitoring data organized in three modes of type, location, and time [Faloutsos et al., 2007].

Similarly, fourth-order tensor data are four-dimensional (4D). Figure 1.3(a) depicts a functional magnetic resonance imaging (fMRI) scan sequence in brain mapping research [van de Ven et al., 2004]. It is a 4D object with four modes: three spatial modes (column, row, and depth) and one temporal mode. Another fourth-order tensor example is network traffic data with four modes: source IP, destination IP, port number, and time [Kolda and Sun, 2008], as illustrated in Figure 1.3(b).

Our tour through tensor data examples is not meant to be exhaustive. Many other interesting tensor data have appeared and are emerging in a broad spectrum of application domains including computational biology, chemistry, physics, quantum computing, climate modeling, and control engineering [NSF, 2009].

FIGURE 1.2: Examples of third-order tensor data: (a) a 3D face image (Source: www.dirk.colbry.com by Dr. Dirk Colbry), (b) a hyperspectral cube ("Hyperspectral imaging," Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Hyperspectral_imaging), (c) a video sequence, (d) social networks organized in conference × author × keyword, (e) web graphs organized in source × destination × text, (f) environmental sensor monitoring data organized in type × location × time.

FIGURE 1.3: Examples of fourth-order tensor data: (a) a functional magnetic resonance imaging (fMRI) scan sequence with three spatial modes and one temporal mode [Pantano et al., 2005], (b) network traffic data organized in source IP × destination IP × port number × time.

Tensor for feature fusion: Moreover, multiple features of an image (and other data as well) can be represented as a third-order tensor where the first two modes are column and row, and the third mode indexes different features, such that the tensor is used as a feature combination/fusion scheme. For example, local descriptors such as the Scale-Invariant Feature Transform (SIFT) [Lowe, 2004] and Histogram of Oriented Gradients (HOG) [Dalal and Triggs, 2005] form a local descriptor tensor in [Han et al., 2012], which is shown to be more efficient than the bag-of-feature (BOF) model [Sivic and Zisserman, 2003]. Local binary patterns [Ojala et al., 2002] on a Gaussian pyramid [Lindeberg, 1994] are employed to form feature tensors in [Ruiz-Hernandez et al., 2010a,b]. Gradient-based appearance cues are combined in a tensor form in [Wang et al., 2011a], and wavelet transform [Antonini et al., 1992] and Gabor filters [Jain and Farrokhnia, 1991] are used to generate higher-order tensors in [Li et al., 2009a; Barnathan et al., 2010] and [Tao et al., 2007b], respectively. Figure 1.4 shows an example of a third-order tensor formed by the Gabor filter outputs of a gray-level face image.

FIGURE 1.4: A third-order tensor formed by the Gabor filter outputs of a gray-level face image. Here, the tensor is used as a feature fusion scheme.
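As a concrete, if simplified, illustration of this fusion scheme, the MATLAB® sketch below (ours, with a toy directional filter bank standing in for a true Gabor bank and random pixels standing in for a face image) stacks filter response maps along a third "feature" mode:

I = rand(100, 100);                   % stand-in for a gray-level face image
kernels = {[-1 0 1], [-1; 0; 1], ...  % four toy directional kernels
           [0 0 1; 0 0 0; -1 0 0], [1 0 0; 0 0 0; 0 0 -1]};
F = zeros(size(I, 1), size(I, 2), numel(kernels));
for k = 1:numel(kernels)
    F(:, :, k) = conv2(I, kernels{k}, 'same');  % kth filter response map
end
% F is a third-order tensor of column x row x feature, as in Figure 1.4.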

1.2 Dimensionality Reduction via Subspace Learning

Real-world tensor data are commonly specified in a high-dimensional space. Direct operation on this space suffers from the so-called curse of dimensionality:

• Handling high-dimensional data puts a high demand on processing power and resources, so it is computationally expensive [Shakhnarovich and Moghaddam, 2004].

• When the number of data samples available is small compared to their high dimensionality, that is, in the small sample size (SSS) scenario, conventional tools become inadequate and many problems become ill-posed or poorly conditioned [Ma et al., 2011].

Fortunately, these tensor data do not lie randomly in the high-dimensional space; rather, they are highly constrained and confined to a subspace [Shakhnarovich and Moghaddam, 2004; Zhang et al., 2004]. For example, as shown in Figure 1.5, a 256-level facial image of 100 × 100 (on the left) is only one of the 256^10,000 points in the corresponding image space. As faces are constrained with certain specific characteristics, all 256-level facial images of 100 × 100 will occupy only a very small portion, that is, a subspace, of the corresponding image space. Thus, they are intrinsically low-dimensional and there is lots of redundancy.

FIGURE 1.5: A face can be represented as one point in an image space of the same size. All 8-bit 100 × 100 faces occupy only a small portion, that is, a subspace, of the whole image space of size 100 × 100 with 8 bits per pixel.

Dimensionality reduction¹ is an attempt to transform a high-dimensional data set into a low-dimensional representation while retaining most of the information regarding the underlying structure or the actual physical phenomenon [Law and Jain, 2006]. In other words, in dimensionality reduction, we are learning a mapping from high-dimensional input space to low-dimensional output space that is a subspace of the input space; that is, we are doing subspace learning. We can view the low-dimensional representation as latent variables to estimate. Also, we can view this as a feature extraction process and the low-dimensional representation as the features learned. These features can then be used to perform various tasks; for example, they can be fed into a classifier to identify its class label.
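For instance, the classical linear route, sketched below in MATLAB® (our illustration with random stand-in data, not code from the book), vectorizes each sample and learns a projection U by PCA, so that the learned features are y = U'(x - mean):

X = rand(32*32, 200);         % 200 vectorized 32 x 32 samples (stand-in data)
mu = mean(X, 2);
Xc = bsxfun(@minus, X, mu);   % center the samples
[U, ~] = eigs(Xc * Xc', 10);  % top 10 eigenvectors of the total scatter matrix
Y = U' * Xc;                  % 10 x 200 matrix of learned features
% Each column of Y is a low-dimensional feature vector ready for a classifier.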

"In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."

Herbert Simon (1916–2001)
Economist, Turing Award Winner, and Nobel Laureate

Traditional subspace learning algorithms are linear ones operating on vectors, that is, first-order tensors, including principal component analysis (PCA) [Jolliffe, 2002], independent component analysis (ICA) [Hyvärinen et al.,

¹ Dimensionality reduction is also known as dimension reduction or dimensional reduction [Burges, 2010]. Here, we adopt the name most commonly known in the machine learning and pattern recognition literature.
