Clinical Applications of Artificial Neural Networks
Edited by Richard Dybowski, King's College London, and Vanya Gant, University College London Hospitals NHS Trust
CAMBRIDGE UNIVERSITY PRESS
Cambridge, New York, Melbourne, Madrid, Cape Town, Singapore, São Paulo
Cambridge University Press
The Edinburgh Building, Cambridge CB2 8RU, UK
Published in the United States of America by Cambridge University Press, New York
www.cambridge.org
Information on this title: www.cambridge.org/9780521001335
This publication is in copyright. Subject to statutory exception and to the provision of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.
First published in print format
© Cambridge University Press 2001, 2007
ISBN-13 978-0-511-34270-7 eBook (NetLibrary)
ISBN-10 0-511-34270-5 eBook (NetLibrary)
ISBN-13 978-0-521-00133-5 paperback
ISBN-10 0-521-00133-1 paperback
Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication, and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.
Lionel Tarassenko, Mayela Zamora and James Pardey
Craig S. Niederberger and Richard M. Golden
9 Adaptive resonance theory: a foundation for 'apprentice' systems in clinical decision support?
Robert F. Harrison, Simon S. Cross, R. Lee Kennedy, Chee Peng Lim and Joseph Downs
10 Evolving artificial neural networks
V. William Porto and David B. Fogel
Brian D. Ripley and Ruth M. Ripley
12 A review of techniques for extracting rules from trained artificial neural networks
Robert Andrews, Alan B. Tickle and Joachim Diederich
13 Confidence intervals and prediction intervals for feedforward neural networks
Richard Dybowski and Stephen J. Roberts
Vanya Gant, Susan Rodway and Jeremy Wyatt
Contributors

Charles W. Anderson
Dept of Computer Science
Colorado State University
Fort Collins, CO 80523-1873
USA

Robert Andrews
Faculty of Information Technology
Queensland University of Technology
PO Box 2434
Brisbane, QLD 4000
Australia

Mathilde E. Boon
Leiden Cytology and Pathology Laboratory
PO Box 16084
2301 GB Leiden
The Netherlands

Emma A. Braithwaite
Oxford Biosignals
Magdalen Centre
Oxford, Oxfordshire OX4 4GA
UK

Simon S. Cross
Department of Pathology
Division of Genomic Medicine
University of Sheffield Medical School
Beech Hill Road
Sheffield S10 2UL
UK

Joachim Diederich
Faculty of Information Technology
Queensland University of Technology
PO Box 2434
Brisbane, QLD 4000
Australia

Joseph Downs
Dept of Automatic Control and Systems Engineering
University of Sheffield
Mappin Street
Sheffield S1 3JD
UK

Jimmy Dripps
Integrated Systems Group (ISG)
Electronics and Electrical Engineering
Edinburgh University
Mayfield Road
Edinburgh EH9 3JL
UK

Richard Dybowski
Envisionment
143 Village Way
Pinner, Middlesex HA5 5AA
UK

David B. Fogel
Natural Selection Inc.
3333 N. Torrey Pines Ct, Suite 200
La Jolla, CA 92037
USA

Vanya Gant
Dept of Clinical Microbiology
University College Hospital
London WC1E 6DB
UK

Richard M. Golden
Applied Cognition and Neuroscience Program
School of Human Development, GR41
University of Texas at Dallas
Richardson, TX 75086-0688
USA

Royston Goodacre
Institute of Biological Sciences
Cledwyn Building
The University of Wales, Aberystwyth
Ceredigion SY23 3DD
UK

Robert F. Harrison
Dept of Automatic Control and Systems Engineering
University of Sheffield
Mappin Street
Sheffield S1 3JD
UK

R. Lee Kennedy
Dept of General Internal Medicine
Sunderland Royal Hospital
Kayll Road
Sunderland SR4 7TP
UK

Lambrecht P. Kok
Dept of Biomedical Engineering
University of Groningen
Nijenborgh 4
9747 AG Groningen
The Netherlands

Chee Peng Lim
Dept of Automatic Control and Systems Engineering
University of Sheffield
Mappin Street
Sheffield S1 3JD
UK

Andrew J. Lyon
Neonatal Unit
Simpson Memorial Maternity Pavilion
Edinburgh EH3 9EF
UK

Alan Murray
Integrated Systems Group
Electronics and Electrical Engineering
Edinburgh University
Mayfield Road
Edinburgh EH9 3JL
UK

Craig S. Niederberger
Dept of Urology M/C 955
University of Illinois at Chicago
840 South Wood Street
Chicago, IL 60612
USA

James Pardey
Oxford Instruments Medical Ltd
Manor Way
Old Woking, Surrey GU22 9JU
UK

David A. Peterson
Dept of Computer Science
Colorado State University
Fort Collins, CO 80523-1873
USA

V. William Porto
Natural Selection Inc.
3333 N. Torrey Pines Ct, Suite 200
La Jolla, CA 92037
USA

Brian D. Ripley
Dept of Statistics
University of Oxford
1 South Parks Road
Oxford OX1 3TG
UK

Ruth M. Ripley
Dept of Statistics
University of Oxford
1 South Parks Road
Oxford OX1 3TG
UK

Stephen J. Roberts
Dept of Engineering Science
University of Oxford
Parks Road
Oxford OX1 3PJ
UK

Susan Rodway
12 King's Bench Walk
London EC4Y 7EL
UK

Lionel Tarassenko
Dept of Engineering Science
University of Oxford
Parks Road
Oxford OX1 3PJ
UK

Alan B. Tickle
Faculty of Information Technology
Queensland University of Technology
PO Box 2434
Brisbane, QLD 4000
Australia

Jeremy Wyatt
School of Public Policy
University College London
29/30 Tavistock Square
London WC1H 9EZ
UK

Mayela Zamora
Dept of Engineering Science
University of Oxford
Parks Road
Oxford OX1 3PJ
UK
Introduction

Richard Dybowski and Vanya Gant

In this introduction we outline the types of neural network featured in this book and how they relate to standard statistical methods. We also examine the issue of the so-called 'black-box' aspect of neural networks and consider some possible future directions in the context of clinical medicine. Finally, we give an overview of the remaining chapters.
A few evolutionary branches

The structure of the brain as a complex network of multiply connected cells (neural networks) was recognized in the late 19th century, primarily through the work of the Italian cytologist Golgi and the Spanish histologist Ramón y Cajal.1 Within the reductionist approach to cognition (Churchland 1986), there appeared the question of how cognitive function could be modelled by artificial versions of these biological networks. This was the initial impetus for what has become a diverse collection of computational techniques known as artificial neural networks (ANNs).
The design of artificial neural networks was originally motivated by the phenomena of learning and recognition, and the desire to model these cognitive processes. But, starting in the mid-1980s, a more pragmatic stance emerged, and ANNs are now regarded as non-standard statistical tools for pattern recognition. It must be emphasized that, in spite of their biological origins, they are not 'computers that think', nor do they perform 'brain-like' computations.

The 'evolution' of artificial neural networks is divergent and has resulted in a wide variety of 'phyla' and 'genera'. Rather than examine the development of every branch of the evolutionary tree, we focus on those associated with the types of ANN mentioned in this book, namely multilayer perceptrons (Chapters 2–8, 10–13), radial basis function networks (Chapter 12), Kohonen feature maps (Chapters 2, 5), adaptive resonance theory networks (Chapters 2, 9), and neuro-fuzzy networks (Chapters 10, 12).
We have not set out to provide a comprehensive tutorial on ANNs; instead, we have suggested sources of information throughout the text, and we have provided some recommended reading in Appendix 1.1.

Figure 1.1. A graphical representation of a McCulloch–Pitts neuron, and also of a single-layer perceptron. In the former, a discontinuous step function is applied to the weighted sum w0 + w1x1 + ··· + wdxd to produce the output y; in the latter, the step function is replaced by a continuous sigmoidal function.
Multilayer perceptrons

At the start of the 20th century, a number of general but non-mathematical theories of cognition existed, such as those of Helmholtz and Pavlov. At the University of Pittsburgh in the 1920s, Nicolas Rashevsky, a physicist, began a research programme to place biology within the framework of mathematical physics. This involved a number of projects, including an attempt to mathematically model Pavlovian conditioning in terms of neural networks (Rashevsky 1948). He continued his work at the University of Chicago, where he was joined by Warren McCulloch, a neuroanatomist, and then, in 1942, by a mathematical prodigy called Walter Pitts. Together, McCulloch & Pitts (1943) devised a simple model of the neuron. In this model (Figure 1.1), the input signals x1, ..., xd to a neuron are combined as a weighted sum w0 + w1x1 + ··· + wdxd. If the sum exceeds a predefined threshold value, the output signal y from the neuron equals 1; otherwise, it is 0. However, a McCulloch–Pitts neuron by itself is capable only of simple tasks, namely discrimination between sets of input values separable by a (possibly multidimensional) plane. Furthermore, the weights required for the neurons of a network had to be provided manually, as no method for automatically determining the weights was available at that time.
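The behaviour of a McCulloch–Pitts neuron can be sketched in a few lines of Python (an illustrative sketch, not part of the original text; the weights shown, which happen to realize a logical AND, are arbitrary examples):

```python
def mcculloch_pitts(x, w):
    """McCulloch-Pitts neuron: output 1 if the weighted sum
    w[0] + w[1]*x[0] + ... + w[d]*x[d-1] exceeds zero, else 0.
    The bias weight w[0] plays the role of a negative threshold."""
    s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if s > 0 else 0

# Example: two inputs, weights chosen (arbitrarily) so that the
# neuron fires only when both inputs are 1 (logical AND)
w = [-1.5, 1.0, 1.0]
y = mcculloch_pitts([1, 1], w)   # fires: 0.5 > 0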
Rosenblatt (1958) proposed that the McCulloch–Pitts neuron could be the basis of a system able to distinguish between patterns originating from different classes. The system, which he dubbed a perceptron, was a McCulloch–Pitts neuron with preprocessed inputs.2 Motivated by Hebb's (1949) hypothesis that learning is based on the reinforcement of active neuronal connections, Rosenblatt (1960, 1962) developed the perceptron learning rule and its associated convergence theorem. This solved the problem of a McCulloch–Pitts neuron 'learning' a set of weights. A number of workers (e.g. Block 1962) proved that the learning rule, when applied to a perceptron consisting of only a single layer of weights, would always modify the weights so as to give the optimal planar decision boundary possible for that perceptron.

Figure 1.2. A multilayer perceptron with two layers of weights. The first layer of nodes, which receives the inputs x1, ..., xd, is called the input layer. The layer of nodes producing the output values is called the output layer. Layers of nodes between the input and output layers are referred to as hidden layers. The weighted sum hj at the j-th hidden node is given by hj = wj0 + wj1x1 + ··· + wjdxd. The value from the j-th hidden node to the output node is a function fhid of hj, and the output y(x; w) is a function fout of the weighted sum w0 + w1 fhid(h1) + ··· + wm fhid(hm). Functions fhid and fout are typically sigmoidal. Note that a multilayer perceptron can have more than one layer of hidden nodes and more than one node providing output values.
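The perceptron learning rule can be sketched as follows (a minimal illustration; the function name, learning rate and toy logical-OR training set are our own, not from the original text):

```python
def train_perceptron(examples, d, lr=0.1, epochs=100):
    """Perceptron learning rule: for each example (x, t) with target
    t in {0, 1}, nudge the weights by lr * (t - y) whenever the
    thresholded prediction y is wrong. For linearly separable data
    the convergence theorem guarantees this eventually stops."""
    w = [0.0] * (d + 1)            # w[0] is the bias weight
    for _ in range(epochs):
        errors = 0
        for x, t in examples:
            s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            y = 1 if s > 0 else 0
            if y != t:
                errors += 1
                w[0] += lr * (t - y)
                for i, xi in enumerate(x):
                    w[i + 1] += lr * (t - y) * xi
        if errors == 0:            # every example classified correctly
            break
    return w

# A linearly separable toy problem (logical OR)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w = train_perceptron(data, d=2)
```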
Multilayer perceptrons (MLPs) are perceptrons having more than one layer of weights (Figure 1.2), which enables them to produce complex decision boundaries. Unfortunately, as pointed out by Minsky & Papert (1969), the perceptron learning rule did not apply to MLPs,3 a fact that severely limited the types of problem to which perceptrons could be applied. This caused many researchers to leave the field, thereby starting the 'Dark Ages' of neural networks, during which little research was done. The turning point came in the mid-1980s, when the back-propagation algorithm for training multilayer perceptrons was discovered independently by several researchers (LeCun 1985; Parker 1985; Rumelhart et al. 1986).4 This answered the criticisms of Minsky & Papert (1969), and the Renaissance of neural networks began.
Multilayer perceptrons with sigmoidal hidden node functions are the most commonly used ANNs, as exemplified by the contributions to this book and the reviews by Baxt (1995) and Dybowski & Gant (1995). Each hidden node in Figure 1.2 produces a hyperplane boundary in the multidimensional space containing the input data. The output node smoothly interpolates between these boundaries to give decision regions of the input space occupied by each class of interest. With a single logistic output unit, MLPs can be viewed as a non-linear extension of logistic regression, and, with two layers of weights, they can approximate any continuous function (Blum & Li 1991).5 Although training an MLP by back-propagation can be a slow process, there are faster alternatives such as Quickprop (Fahlman 1988).
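The forward pass of the two-layer network of Figure 1.2, with logistic (sigmoidal) hidden and output functions, can be sketched as follows (the weights are illustrative only; with a single logistic output unit the result lies between 0 and 1, as in logistic regression):

```python
import math

def sigmoid(r):
    """Logistic function: a continuous sigmoidal activation."""
    return 1.0 / (1.0 + math.exp(-r))

def mlp_forward(x, W_hidden, w_out):
    """Forward pass of a two-layer MLP. W_hidden[j] holds the weights
    [wj0, wj1, ..., wjd] of the j-th hidden node; w_out holds
    [w0, w1, ..., wm] for the single output node."""
    hidden = [sigmoid(wj[0] + sum(wji * xi for wji, xi in zip(wj[1:], x)))
              for wj in W_hidden]
    s = w_out[0] + sum(wj * hj for wj, hj in zip(w_out[1:], hidden))
    return sigmoid(s)   # logistic output: y lies in (0, 1)

# Illustrative weights only: two inputs, two hidden nodes
y = mlp_forward([0.5, -1.0],
                W_hidden=[[0.1, 0.4, -0.2], [-0.3, 0.8, 0.5]],
                w_out=[0.2, 1.0, -1.5])
```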
A particularly eloquent discussion of MLPs is given by Bishop (1995, Chap. 4) in his book Neural Networks for Pattern Recognition.
A statistical perspective on multilayer perceptrons

The genesis and renaissance of ANNs took place within various communities, and articles published during this period reflect the disciplines involved: biology and cognition, statistical physics, and computer science. But it was not until the early 1990s that a probability-theoretic perspective emerged, with Bridle (1991), Ripley (1993), Amari (1993) and Cheng & Titterington (1994) being amongst the first to regard ANNs as being within the framework of statistics. The statistical aspect of ANNs has also been highlighted in textbooks by Smith (1993), Bishop (1995) and Ripley (1996).
A recurring theme of this literature is that many ANNs are analogous to, or identical with, existing statistical techniques. For example, a popular statistical method for modelling the relationship between a binary response variable y and a vector (an ordered set) of covariates x is logistic regression (Hosmer & Lemeshow 1989; Collett 1991), but consider the single-layer perceptron of Figure 1.1:

y(x; w) = fout(w0 + w1x1 + ··· + wdxd).  (1.1)

If the output function fout of Eq. (1.1) is logistic, fout(r) = {1 + exp(−r)}−1 (where r is any value), and the perceptron is trained with a cross-entropy error function, Eq. (1.1) will be functionally identical with a main-effects logistic regression model.
Using the notation of Figure 1.2, the MLP can be written as

y(x; w) = fout(w0 + w1 fhid(h1) + ··· + wm fhid(hm)),  (1.2)

but Hwang et al. (1994) have indicated that Eq. (1.2) can be regarded as a particular type of projection pursuit regression model when fout is linear:

y(x; w) = w0 + w1 f1(h1) + ··· + wm fm(hm).  (1.3)

Projection pursuit regression (Friedman & Stuetzle 1981) is an established statistical technique and, in contrast to an MLP, each function fj in Eq. (1.3) can be different, thereby providing more flexibility.6 However, Ripley and Ripley (Chapter 11) point out that the statistical algorithms for fitting projection pursuit regression are not as effective as those for fitting MLPs.
Another parallel between neural and statistical models exists with regard to the problem of overfitting. In using an MLP, the aim is to have the MLP generalize from the data rather than have it fit to the data (overfitting). Overfitting can be controlled for by adding a regularization function to the error term (Poggio et al. 1985). This additional term penalizes an MLP that is too flexible. In statistical regression the same concept exists in the form of the Akaike information criterion (Akaike 1974). This is a linear combination of the deviance and the number of independent parameters, the latter penalizing the former. Furthermore, when regularization is implemented using weight decay (Hinton 1989), a common approach, the modelling process is analogous to ridge regression (Montgomery & Peck 1992, pp. 329–344) – a regression technique that can provide good generalization.
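The analogy with ridge regression can be made concrete: both weight decay and ridge regression add a penalty proportional to the sum of squared weights to the error being minimized. A one-dimensional sketch (the data are invented for illustration):

```python
def ridge_fit(xs, ys, lam):
    """One-dimensional ridge regression (no intercept): minimizes
    sum_i (y_i - w*x_i)^2 + lam * w^2, which has the closed form
    w = sum(x*y) / (sum(x^2) + lam). Setting lam = 0 recovers
    ordinary least squares; lam > 0 shrinks w towards zero, just as
    weight decay shrinks network weights."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

xs, ys = [1.0, 2.0, 3.0], [2.1, 3.9, 6.2]
w_ols = ridge_fit(xs, ys, lam=0.0)    # ordinary least squares
w_ridge = ridge_fit(xs, ys, lam=5.0)  # shrunk towards zero
```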
One may ask whether the apparent similarity between ANNs and existing statistical methods means that ANNs are redundant within pattern recognition. One answer to this is given by Ripley (1996, p. 4):

The traditional methods of statistics and pattern recognition are either parametric, based on a family of models with a small number of parameters, or non-parametric, in which the models used are totally flexible. One of the impacts of neural network methods on pattern recognition has been to emphasize the need in large-scale practical problems for something in between: families of models with large but not unlimited flexibility given by a large number of parameters. The two most widely used neural network architectures, multi-layer perceptrons and radial basis functions (RBFs), provide two such families (and several others already in existence).
In other words, ANNs can act as semi-parametric classifiers, which are more flexible than parametric methods (such as the quadratic discriminant function (e.g. Krzanowski 1988)) but require fewer model parameters than non-parametric methods (such as those based on kernel density estimation (Silverman 1986)). However, setting up a semi-parametric classifier can be more computationally intensive than using a parametric or non-parametric approach.
Another response is to point out that the widespread fascination with ANNs has attracted many researchers and potential users into the realm of pattern recognition. It is true that the neural-computing community rediscovered some statistical concepts already in existence (Ripley 1996), but this influx of participants has created new ideas and refined existing ones. These benefits include the learning of sequences by time delay and partial recurrence (Lang & Hinton 1988; Elman 1990) and the creation of powerful visualization techniques, such as generative topographic mapping (Bishop et al. 1997). Thus the ANN movement has resulted in statisticians having available to them a collection of techniques to add to their repertoire. Furthermore, the placement of ANNs within a statistical framework has provided a firmer theoretical foundation for neural computation, and it has led to new developments such as the Bayesian approach to ANNs (MacKay 1992).

Unfortunately, the rebirth of neural networks during the 1980s has been accompanied by hyperbole and misconceptions that have led to neural networks being trained incorrectly. In response to this, Tarassenko (1995) highlighted three areas where care is required in order to achieve reliable performance: firstly, there must be sufficient data to enable a network to generalize effectively; secondly, informative features must be extracted from the data for use as input to a network; thirdly, balanced training sets should be used for underrepresented classes (or novelty detection used when abnormalities are very rare (Tarassenko et al. 1995)).
Tarassenko (1998) discussed these points in detail, and he stated:

It is easy to be carried away and begin to overestimate their capabilities. The usual consequence of this is, hopefully, no more serious than an embarrassing failure with concomitant mutterings about black boxes and excessive hype. Neural networks cannot solve every problem. Traditional methods may be better. Nevertheless, neural networks, when they are used wisely, usually perform at least as well as the most appropriate traditional method and in some cases significantly better.
It should also be emphasized that, even with correct training, an ANN will not necessarily be the best choice for a classification task in terms of accuracy. This has been highlighted by Wyatt (1995), who wrote:

Neural net advocates claim accuracy as the major advantage. However, when a large European research project, StatLog, examined the accuracy of five ANN and 19 traditional statistical or decision-tree methods for classifying 22 sets of data, including three medical data sets [Michie et al. 1994], a neural technique was the most accurate in only one data set, on DNA sequences. For 15 (68%) of the 22 sets, traditional statistical methods were the most accurate, and those 15 included all three medical data sets.
But one should add the comment made by Michie et al. (1994, p. 221) on the results of the StatLog project:

With care, neural networks perform very well as measured by error rate. They seem to provide either the best or near best predictive performance in nearly all cases ...
Figure 1.3. A radial basis function network. The network has a single layer of basis functions between the input and output layers. The value φj produced by the j-th basis function is dependent on the distance between the 'centre' x[j] of the function and the vector of input values x1, ..., xd. The output y(x; w) is the weighted sum w0 + w1φ1 + ··· + wmφm. Note that a radial basis function network can have more than one output node, and the functions φ1, ..., φm need not be the same.

Nevertheless, when an ANN is being evaluated, its performance must be compared with that obtained from one or more appropriate standard statistical techniques.
Radial basis function networks

Unlike MLPs, a number of so-called 'neural networks' were not biologically motivated, and one of these is the radial basis function network. Originally conceived in order to perform multivariate interpolation (Powell 1987), radial basis function networks (RBFNs) (Broomhead & Lowe 1988) are an alternative to MLPs. Like an MLP, an RBFN has input and output nodes; but there the similarity ends, for an RBFN has a middle layer of radially symmetric functions called basis functions, each of which can be designed separately (Figure 1.3). The idea of using basis functions originates from the concept of potential functions proposed by Bashkirov et al. (1964) and illustrated by Duda & Hart (1973).
Each basis function can be regarded as being centred on a prototypic vector of input values. When a vector of values is applied to an RBFN, a measure of the proximity of the vector to each of the prototypes is determined by the corresponding basis functions, and a weighted sum of these measures is given as the output of the RBFN (Figure 1.3).
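This computation can be sketched directly (a hypothetical network with Gaussian basis functions, a common choice; the centres, widths and weights are illustrative only):

```python
import math

def rbfn_forward(x, centres, widths, w):
    """Radial basis function network with Gaussian basis functions.
    phi_j measures the proximity of x to the j-th prototype (centre),
    falling towards 0 as the distance grows; the output is the
    weighted sum w[0] + w[1]*phi_1 + ... + w[m]*phi_m."""
    phis = []
    for c, s in zip(centres, widths):
        dist2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        phis.append(math.exp(-dist2 / (2.0 * s ** 2)))
    return w[0] + sum(wj * pj for wj, pj in zip(w[1:], phis))

# Illustrative network: two Gaussian basis functions in a 2-D input space.
# An input at the first centre excites mainly the first basis function.
y = rbfn_forward([0.0, 0.0],
                 centres=[[0.0, 0.0], [3.0, 3.0]],
                 widths=[1.0, 1.0],
                 w=[0.0, 1.0, 1.0])
```

An input far from every centre produces small φj values throughout, which is the basis of the novelty-flagging behaviour discussed below.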
The basis functions define local responses (receptive fields) (Figure 1.4). Typically, only some of the hidden units (basis functions) produce significant values for the final layers. This is why RBFNs are sometimes referred to as localized receptive field networks. In contrast, all the hidden units of an MLP are involved in determining the output from the network (they are said to form a distributed representation). The receptive field approach can be advantageous when the distribution of the data in the space of input values is multimodal (Wilkins et al. 1994). Furthermore, RBFNs can be trained more quickly than MLPs (Moody & Darken 1989), but the number of basis functions required can grow exponentially with the number of input nodes (Hartman et al. 1990), and an increase in the number of basis functions increases the time taken, and amount of data required, to train an RBFN adequately.

Figure 1.4. Schematic representation of possible decision regions created by (a) the hyperplanes of a multilayer perceptron, and (b) the kernel functions of a radial basis function network. The circles and crosses represent data points from two respective classes.
Under certain conditions (White 1989; Lowe & Webb 1991; Nabney 1999), an RBFN can act as a classifier. An advantage of the local nature of RBFNs compared with MLP classifiers is that a new set of input values that falls outside all the localized receptive fields could be flagged as not belonging to any of the classes represented. In other words, the set of input values is novel. This is a more cautious approach than the resolute classification that can occur with MLPs, in which a set of input values is always assigned to a class, irrespective of the values. For further details on RBFNs, see Bishop (1995, Chap. 5).
A statistical perspective on radial basis function networks

A simple linear discriminant function (Hand 1981, Chap. 4) has the form

g(x) = w0 + w1x1 + ··· + wdxd,  (1.4)

with x assigned to a class of interest if g(x) is greater than a predefined constant. This provides a planar decision surface and is functionally equivalent to the McCulloch–Pitts neuron. Equation (1.4) can be generalized to a linear function of functions, namely a generalized linear discriminant function

g(x) = w0 + w1 f1(x) + ··· + wm fm(x),  (1.5)

which permits the construction of non-linear decision surfaces. If we represent an RBFN by the expression

g(x) = w0 + w1φ1(‖x − x[1]‖) + ··· + wmφm(‖x − x[m]‖),  (1.6)

where ‖x − x[i]‖ denotes the distance (usually Euclidean) between input vector x and the 'centre' x[i] of the i-th basis function φi, comparison of Eq. (1.5) with Eq. (1.6) shows that an RBFN can be regarded as a type of generalized linear discriminant function.
Multilayer perceptrons and RBFNs are trained by supervised learning. This means that an ANN is presented with a set of examples, each example being a pair (x, t), where x is a vector of input values for the ANN, and t is the corresponding target value, for example a label denoting the class to which x belongs. The training algorithm adjusts the parameters of the ANN so as to minimize the discrepancy between the target values and the outputs produced by the network.

In contrast to MLPs and RBFNs, the ANNs in the next two sections are based on unsupervised learning. In unsupervised learning, there are no target values available, only input values, and the ANN attempts to categorize the inputs into classes. This is usually done by some form of clustering operation.
Kohonen feature maps

Many parts of the brain are organized in such a way that different sensory inputs are mapped to spatially localized regions within the brain. Furthermore, these regions are represented by topologically ordered maps. This means that the greater the similarity between two stimuli, the closer the location of their corresponding excitation regions. For example, visual, tactile and auditory stimuli are mapped onto different areas of the cerebral cortex in a topologically ordered manner (Hubel & Wiesel 1977; Kaas et al. 1983; Suga 1985). Kohonen (1982) was one of a group of people (others include Willshaw & von der Malsburg (1976)) who devised computational models of this phenomenon.
TheaimofKohonen’s(1982) self-organizingfeaturemaps (SOFMs)istomapan inputvectortooneofasetofneuronsarrangedinalattice,andtodosoinsucha waythatpositionsininputspacearetopologicallyorderedwithlocationsonthe lattice.Thisisdoneusingatrainingsetofinputvectors (1),..., (m)andasetof prototypevectors w(1),..., w(n)ininputspace.Eachprototypevector w(i)is associatedwithalocation S(i)on(typically)alattice(Figure1.5).
AstheSOFMalgorithmpresentseachinputvector tothesetofprototype vectors,thevector w(i*)nearestto ismovedtowards accordingtoalearning
Figure1.5.AgraphicaldepictionofKohonen’sself-organizingfeaturemap.Seepp.9–10foran explanation.Thelatticeistwo-dimensional,whereasdatapoint(inputvector) andprototypevectors w(i*)and w(h)resideinahigher-dimensional(input)space.
rule.Indoingso,thealgorithmalso‘drags’towards (buttoalesserextent)those prototypevectorswhoseassociatedlocationsonthelatticeareclosestto S(i*), where S(i*)isthelatticelocationassociatedwith w(i*).Forexample, w(h)in Figure1.5isdraggedalongwith w(i*)towards .Hertzetal.(1991)likenedthis processtoanelasticnet,existingininputspace,whichwantstocomeascloseas possibleto (1),..., (m).Thecoordinatesoftheintersectionsoftheelasticnet aredeWnedbytheprototypevectors w(1),..., w(n).Ifsuccessful,twolocations S(j)and S(k)onthelatticewillbeclosertoeachothertheclosertheirassociated prototypevectors w(j)and w(k)arepositionedininputspace.
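The learning rule and the 'dragging' of lattice neighbours can be sketched as follows (a simplified illustration with a fixed learning rate and neighbourhood radius; practical implementations usually decay both over time, and the toy data are our own):

```python
import math
import random

def train_sofm(data, rows, cols, dim, epochs=50, lr=0.5, radius=0.5):
    """Kohonen self-organizing feature map on a rows x cols lattice.
    For each input vector xi, the nearest prototype (the winner) is
    moved towards xi; prototypes whose lattice locations are close to
    the winner's are 'dragged' along to a lesser extent, with the
    drag falling off with squared lattice distance."""
    random.seed(0)   # reproducible random initialization
    protos = {(r, c): [random.random() for _ in range(dim)]
              for r in range(rows) for c in range(cols)}
    for _ in range(epochs):
        for xi in data:
            # winning (nearest) prototype, measured in input space
            win = min(protos, key=lambda k: sum((a - b) ** 2
                      for a, b in zip(protos[k], xi)))
            for k, w in protos.items():
                d2 = (k[0] - win[0]) ** 2 + (k[1] - win[1]) ** 2
                h = math.exp(-d2 / (2.0 * radius ** 2))  # neighbourhood
                for j in range(dim):
                    w[j] += lr * h * (xi[j] - w[j])
    return protos

# Two well-separated toy data points; a 1 x 3 lattice of prototypes
protos = train_sofm([[0.0, 0.0], [1.0, 1.0]], rows=1, cols=3, dim=2)
```

After training, the lattice ends should have been pulled towards the two clusters, illustrating the 'elastic net' picture above.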
The SOFM algorithm provides a means of visualizing the distribution of data points in input space, but, as pointed out by Bishop (1995), this can be weak if the data do not lie within a two-dimensional subspace of the higher-dimensional space containing the data. Another problem with SOFMs is that the 'elastic net' could twist as it moves towards the training set, resulting in a distorted visualization of the data (e.g. Hagan et al. 1996).

For those wishing to know more about SOFMs, we recommend the book Neural Computation and Self-Organizing Maps by Ritter et al. (1992).
Adaptive resonance theory networks

A feature of cognitive systems is that they can be receptive to new patterns (described as plasticity) but remain unchanged by others (described as stability). The vexing question of how this is possible was referred to as the stability/plasticity dilemma (Grossberg 1976), but Carpenter & Grossberg (1987) developed a theory called adaptive resonance theory (ART) to explain this phenomenon.
In terms of design, ART networks are the most complex ANNs featured in this book, yet the principle is quite straightforward. Caudill & Butler (1990) regard the process as a type of hypothesis test. A pattern presented at an input layer is passed to a second layer, which is interconnected to the first. The second layer makes a guess about the category to which the original pattern belongs, and this hypothetical identity is passed back to the first layer. The hypothesis is compared with the original pattern and, if found to be a close match, the hypothesis and original pattern reinforce each other (resonance is said to take place). But if the hypothesis is incorrect, the second layer produces another guess. If the second layer cannot eventually provide a good match with the pattern, the original pattern is learned as the first example of a new category.
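The hypothesis-test cycle can be sketched for binary patterns in the style of the ART-1 algorithm (a much-simplified illustration: the choice and match functions below are schematic stand-ins for the full ART dynamics, and the vigilance value and data are arbitrary):

```python
def art_classify(patterns, vigilance=0.6):
    """Simplified ART-1-style clustering of binary patterns. Each
    pattern is matched against stored category prototypes; if the best
    'hypothesis' overlaps the pattern closely enough (the vigilance
    test), the prototype is refined (resonance); otherwise the pattern
    is learned as the first example of a new category."""
    categories = []      # each category is a binary prototype
    labels = []
    for p in patterns:
        best, best_score = None, -1.0
        for idx, proto in enumerate(categories):
            overlap = sum(a & b for a, b in zip(p, proto))
            score = overlap / (1e-9 + sum(proto))  # choice function
            if score > best_score:
                best, best_score = idx, score
        ones = sum(p)
        if best is not None:
            overlap = sum(a & b for a, b in zip(p, categories[best]))
            if ones and overlap / ones >= vigilance:   # vigilance test
                # resonance: prototype learns the intersection
                categories[best] = [a & b
                                    for a, b in zip(p, categories[best])]
                labels.append(best)
                continue
        categories.append(list(p))   # mismatch: found a new category
        labels.append(len(categories) - 1)
    return labels, categories

labels, cats = art_classify([[1, 1, 0, 0], [1, 1, 1, 0], [0, 0, 1, 1]])
```

Raising the vigilance parameter makes the match test stricter, so more (finer-grained) categories are created; this is how ART trades stability against plasticity.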
Although ART provides unsupervised learning, an extension called ARTMAP (Carpenter et al. 1991) combines two ART modules to enable supervised learning to take place.

In spite of resolving the stability/plasticity dilemma, the ART algorithms are sensitive to noise (Moore 1989). Furthermore, Ripley (1996) questions the virtue of the ART algorithms over adaptive k-means clustering, such as that of Hall & Khanna (1977).

Details of the ART concept are provided by Beale & Jackson (1990, Chap. 7) and Hertz, Krogh & Palmer (1991, pp. 228–32).
Neuro-fuzzy networks

Although probability theory is the classic approach to reasoning with uncertainty, Zadeh (1962) argued that there exist linguistic terms, such as 'most' and 'approximate', which are not describable in terms of probability distributions. He then set about developing a mathematical framework called fuzzy set theory (Zadeh 1965) to reason with such qualitative expressions. In classical set theory, an object is either a member of a set or it is not; in fuzzy set theory, grades of membership are allowed, the degree of membership being defined by a membership function.
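A membership function is simply a mapping from an object to a grade of membership in [0, 1]. For example (an invented membership function for the linguistic term 'tall'; the thresholds are arbitrary):

```python
def tall_membership(height_cm):
    """Illustrative fuzzy membership function for the term 'tall':
    grade 0 below 160 cm, grade 1 above 190 cm, and a linearly graded
    membership in between. In classical set theory the value could
    only jump abruptly from 0 to 1."""
    if height_cm <= 160.0:
        return 0.0
    if height_cm >= 190.0:
        return 1.0
    return (height_cm - 160.0) / 30.0

grade = tall_membership(175.0)   # 0.5: 'somewhat tall'
```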
At a time when representation of knowledge was a focal point in artificial intelligence research, Zadeh (1972) suggested that control expertise could be represented using a set of linguistic if–then rules acquired from an operator. In his scheme, execution of the resulting fuzzy controller would be based on the formal rules of fuzzy set theory. But this left the problem of defining the membership functions incorporated in a fuzzy system.
A neuro-fuzzy system determines the parameters of the membership functions of a fuzzy system from examples by means of a neural network. Either the fuzzy system and the neural network are two distinct entities (collaborative neuro-fuzzy systems; e.g. Nomura et al. 1992) or the fuzzy system has a neural-net-like architecture (a hybrid neuro-fuzzy system). The various types of hybrid neuro-fuzzy system include systems analogous to MLPs (Berenji 1992), to RBFNs (Dabija & Tschichold-Gürman 1993), and to Kohonen feature maps (Pedrycz & Card 1992).

More information on neuro-fuzzy networks can be found in the textbook Foundations of Neuro-Fuzzy Systems by Nauck et al. (1997).
The‘black-box’issue
Acriticismlevelledagainstneuralnetworksisthattheyare‘black-box’systems (Sharp1995;Wyatt1995).Bythisitismeantthatthemannerinwhichaneural networkderivesanoutputvaluefromagivenfeaturevectorisnotcomprehensible tothenon-specialist,andthatthislackofcomprehensionmakestheoutputfrom neuralnetworksunacceptable.Thisissueisencounteredseveraltimesinthisbook, namelyinChapters9,12,and14.
Thereareanumberofpropertiesthatwedesireinamodel,twoofwhichare accuracy(the‘closeness’ofamodel’sestimatedvaluetothetruevalue)and interpretability.By interpretability,wemeanthetypeofinput–outputrelationshipsthatcanbeextractedfromamodelandarecomprehensibletotheintended userofthemodel.AtleastthreetypesofinterpretationcanbeidentiWed:
1. A summary of how a change in each input variable affects the output value. This type of interpretation is provided by the regression coefficients of a main-effects logistic regression model (Hosmer & Lemeshow 1989), a virtue of additive models in general (Plate 1998).
2. A summary of all possible input–output relationships obtainable from the model as a finite set of if–then rules. This sort of interpretation is provided by all the root-to-leaf paths present in a tree-structured classifier (Breiman et al. 1984; Buntine 1992).
3. A sequential explanation that shows how the output value provided by a model was obtained from a given input vector. The explanation uses a chain of inference with steps that are meaningful to the user of the model. Such an explanation is provided by a most probable configuration in Bayesian belief networks (Jensen 1996, pp. 104–107).
An interpretable model is advantageous for several reasons. It could be educational, by supplying a previously unknown but useful input–output summary; this, in turn, can lead to new areas of research. It could also disclose an error in the model, when an input–output summary or explanation contradicts known facts.
Does the lack of interpretability, as defined above, make a model unacceptable? That depends on the purpose of the model. Suppose that the choice of a statistical model for a given problem is reasonable (on theoretical or heuristic grounds), and an extensive empirical assessment of the model (e.g. by cross-validation and prospective evaluation) shows that its parameters provide an acceptable degree of accuracy over a wide range of input vectors. The use of such a model for prediction would generally be approved, subject to a performance-monitoring policy. Why not apply the same reasoning to neural networks, which are, after all, non-standard statistical models?

But suppose that we are interested in knowledge discovery (Brachman & Anand 1996); by this we mean the extraction of previously unknown but useful information from data. With a trained MLP, it is very difficult to interpret the mass of weights and connections within the network, and the interactions implied by these. The goal of rule extraction (Chapter 12) is to map the (possibly complex) associations encoded by the functions and parameters of an ANN to a set of comprehensible if–then rules. If successful, such a mapping would lead to an interpretable collection of statements describing the associations discovered by the ANN.
New developments and future prospects

What have ANNs got to offer medicine in the future? The answer is not so much whether they can, but how far they can be used to solve problems of clinical relevance – and whether this will be considered acceptable. Medicine is a complex discipline, but the ability of ANNs to model complexity may prove to be rewarding. Complexity in this context can be broken down into three elements, each with very different parameters and requirements.
The Wrstisinmanywaysthe‘purest’andyetthemostimpenetrable,and concernsthecomplexityofindividualcells.Aftertheinitial Xushofenthusiasm, andtheperceivedempowerment andpromisebroughtaboutbytherevolutionof molecularbiology,itsoonbecameapparentthataseeminglyendlessstreamof datapertainingtogeneticsequencewasoflittleavailinitself.Wehavebegunto cometotermswiththeextraordinarynumberofgenesmakingupthemostbasic oflivingorganisms.Addedtothisisthegrowingrealizationthatthesegenes, numberedintheirthousandsinthesimplestoflivingorganisms,interactwith eachotherbothatthelevelofthegenomeitself,andthenattheleveloftheir proteinproducts.Therefore,afundamentaldiYcultyarisesinourabilityto
understandsuchprocessesby‘traditional’methods.Thistensionhasgenerated amongstothersthedisciplineofreversegenomics(Oliver1997),whichattempts toimputefunctiontoindividualgeneswithknownandthereforepenetrable sequencesinthecontextofseeminglyimpenetrablecomplexlivingorganisms.At thetimeofwriting,thepotentialofsuchmathematicalmethodstomodelthese interactionsatthelevelofthesinglecellremainsunexplored.ANNsmayallow complexbiologicalsystemstobemodelledatahigherlevel,throughthoughtful experimentaldesignandnoveldataderivedfromincreasinglysophisticatedtechniquesofphysicalmeasurement.Anybehaviouratthesinglecelllevelproductivelymodelledinthiswaymayhavefundamentalconsequencesformedicine.
The second level concerns individual disease states at the level of individual human beings. The cause of many diseases continues to be ascribed (if not understood) to the interaction between individuals and their environment. One example here might be the variation in human response to infection with a virulent pathogen, where an individual whose (genetically determined) immune system has been programmed by his environment (Rook & Stanford 1998) may live or die depending on how the immune system responds to the invader. Complex data sets pertaining to genetic and environmental aspects of this life-or-death interaction may be amenable to ANN modelling techniques. This question of life or death after environmental insult has already been addressed using ANNs in the 'real' context of outcome in intensive care medicine (e.g. Dybowski et al. 1996). We see no reason why such an approach cannot be extended to questions of epidemiology. For example, genetic and environmental factors contributing to the impressive worldwide variation in coronary heart disease continue to be identified (Criqui & Ringel 1994), yet how these individual factors interact continues to elude us. An ANN approach to such formally unresolved questions, when coupled with rule extraction (Chapter 12), may reveal the exact nature and extent of risk-factor interaction.
The third level concerns the analysis of clinical and laboratory observations and disease. Until we have better tools to identify those molecular elements responsible for the disease itself, we rely on features associated with them whose relationship to disease remains unidentified and, at best, 'second hand'. Examples in the real world of clinical medicine include X-ray appearances suggestive of infection rather than tumour (Medina et al. 1994), and abnormal histological reports of uncertain significance (PRISMATIC project management team 1999). Until the discipline of pathology reveals the presence or absence of such abnormality at the molecular level, many pathological findings continue to be couched in probabilistic terms; however, ANNs have the potential of modelling the complexity of the data at the supramolecular level. We note some progress in at least two of these areas: the screening of cytological specimens, and the interpretation of flow-cytometric data.
Clinical pathology laboratories are being subjected to an ever-increasing workload. Much of the data received by these laboratories consists of complex figures, such as cytological specimens – objects traditionally interpreted by experts – but experts are a limited resource. The success of using ANNs to automate the interpretation of such objects has been illustrated by the PAPNET screening system (Chapter 3), and we expect that the analysis of complex images by ANNs will increase with demand.
We now switch to a different channel in our crystal ball and consider three relatively new branches on the evolutionary tree of neural computation, all of which could have an impact on clinically oriented ANNs. The first of these is Bayesian neural computation, the second is support vector machines, and the third is graphical models.
Bayesian neural computation
Whereas classical statistics attempts to draw inferences from data alone, Bayesian statistics goes further by allowing data to modify prior beliefs (Lee 1997). This is done through the Bayesian relationship
p(θ | D) ∝ p(θ) p(D | θ),
where p(θ) is the prior probability of a statement θ, and p(θ | D) is the posterior probability of θ following the observation of data D. Another feature of Bayesian inference, and one of particular relevance to ANNs, is that unknown parameters such as network weights w can be integrated out, for example
p(C | x, D) = ∫ p(C | x, w) p(w | D) dw,
where p(C | x, w) is the probability of class C given input x and weights w, and p(w | D) is the posterior probability distribution of the weights.
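In practice this integral is rarely tractable and is usually approximated by averaging over samples drawn from the weight posterior. The sketch below is purely illustrative: it assumes a toy one-weight logistic model and substitutes a made-up Gaussian for p(w | D); in a real Bayesian ANN the samples would come from MCMC or an analytic approximation (see Bishop 1995, Chap. 10).

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Stand-in for the weight posterior p(w | D) of a one-input logistic
# model; the mean and spread here are invented for illustration.
posterior_samples = rng.normal(loc=1.5, scale=0.5, size=1000)

def predictive_prob(x, w_samples):
    """Monte Carlo estimate of p(C | x, D) = integral of
    p(C | x, w) p(w | D) dw, i.e. the weights are averaged out."""
    return sigmoid(w_samples * x).mean()

p = predictive_prob(2.0, posterior_samples)
assert 0.0 < p < 1.0
```

Averaging the class probability over the posterior, rather than plugging in a single 'best' weight vector, is what gives Bayesian ANNs their honest treatment of weight uncertainty.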
The Bayesian approach has been applied to various aspects of statistics (Gelman et al. 1995), including ANNs (MacKay 1992). Advantages to neural computation of the Bayesian framework include:
- a principled approach to fitting an ANN to data via regularization (Buntine & Weigend 1991),
- allowance for multiple solutions to the training of an MLP by a committee of networks (Perrone & Cooper 1993),
- automatic selection of features to be used as input to an MLP (automatic relevance determination (Neal 1994; MacKay 1995)).
Bayesian ANNs have not yet found their way into general use, but, given their capabilities, we expect them to take a prominent role in mainstream neural computation.
Because of its intrinsic mathematical content, we will not give a detailed account of the Bayesian approach to neural computation in this introduction; instead, we refer the interested reader to Bishop (1995, Chap. 10).
Support vector machines
Although the perceptron learning rule (see p. 3) is able to position a planar decision boundary between two linearly separable classes, the location of the boundary may not be optimal as regards the classification of future data points. However, if a single-layer perceptron is trained with the iterative adatron algorithm (Anlauf & Biehl 1989), the resulting planar decision boundary will be optimal.
It can be shown that the optimal position for a planar decision boundary is that which maximizes the Euclidean distance between the boundary and the nearest exemplars to the boundary from the two classes (the support vectors) (see e.g. Vapnik 1995).
One way of regarding an RBFN is as a system in which the basis functions collectively map the space of input values to an auxiliary space (the feature space), whereupon a single-layer perceptron is trained on points in feature space originating from the training set. If the perceptron can be trained with a version of the adatron algorithm suitable for points residing in feature space then the perceptron will have been trained optimally. Such an iterative algorithm exists (the kernel adatron algorithm; Friess & Harrison 1998), and the resulting network is a support vector machine. Vapnik (1995) derived a non-iterative algorithm for this optimization task, and it is his algorithm that is usually associated with support vector machines. A modification of the procedure exists for when the points in feature space are not linearly separable.
In order to maximize the linear separability of the points in feature space, a basis function is centred on each data point, but the resulting support vector machine effectively uses only those basis functions associated with the support vectors and ignores the rest. Further details about support vector machines can be found in the book by Cristianini & Shawe-Taylor (2000).
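The iterative training just described can be sketched in a few lines. The code below is an illustrative hard-margin implementation of one common formulation of the kernel adatron update, α_i ← max(0, α_i + η(1 − y_i z_i)), assuming a Gaussian basis function centred on each data point and an invented toy two-class data set; points whose multiplier α remains non-zero after training play the role of the support vectors.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) basis function centred on each data point."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

def kernel_adatron(X, y, eta=0.1, epochs=500, gamma=1.0):
    """Train a hard-margin kernel classifier with an adatron-style
    update; returns the multipliers alpha and the kernel matrix."""
    n = len(X)
    K = np.array([[rbf_kernel(X[i], X[j], gamma) for j in range(n)]
                  for i in range(n)])
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            z = np.sum(alpha * y * K[i])   # current margin of point i
            alpha[i] = max(0.0, alpha[i] + eta * (1.0 - y[i] * z))
    return alpha, K

# A tiny made-up two-class problem.
X = np.array([[0.0, 0.0], [0.2, 0.1], [2.0, 2.0], [2.1, 1.9]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
alpha, K = kernel_adatron(X, y)

def predict(i):
    return np.sign(np.sum(alpha * y * K[i]))

assert all(predict(i) == y[i] for i in range(len(X)))
```

Only training points are classified here, to keep the sketch short; predicting a new point would evaluate the same weighted kernel sum against all training exemplars.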
Neural networks as graphical models
Within mathematics and the mathematical sciences, it can happen that two disciplines, developed separately, are brought together. We are witnessing this type of union between ANNs and graphical models.
A (probabilistic) graphical model is a graphical representation (in the graph-theoretic sense (Wilson 1985)) of the joint probability distribution p(X1, ..., Xn) over a set of variables X1, ..., Xn (Buntine 1994). Each node of the graph corresponds to a variable, and an edge between two nodes implies a probabilistic dependence between the corresponding variables.
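To make the idea concrete, the sketch below builds a small directed graphical model over three binary variables. The variable names and probability tables are entirely invented for illustration (they are not taken from any study cited here); the point is that the joint distribution factorizes along the edges of the graph, and that simple probabilistic inference reduces to marginalizing the factorized joint.

```python
import itertools

# Hypothetical model:  Smoker -> Cancer -> AbnormalXray
# so the joint factorizes as p(S, C, X) = p(S) p(C | S) p(X | C).
p_S = {True: 0.3, False: 0.7}
p_C_given_S = {True:  {True: 0.05, False: 0.95},
               False: {True: 0.01, False: 0.99}}
p_X_given_C = {True:  {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}

def joint(s, c, x):
    """Joint probability read straight off the graph structure."""
    return p_S[s] * p_C_given_S[s][c] * p_X_given_C[c][x]

# The factorized joint is a proper distribution: it sums to 1.
total = sum(joint(s, c, x)
            for s, c, x in itertools.product([True, False], repeat=3))
assert abs(total - 1.0) < 1e-12

# Inference by marginalization: p(Cancer | AbnormalXray).
num = sum(joint(s, True, True) for s in [True, False])
den = sum(joint(s, c, True)
          for s, c in itertools.product([True, False], repeat=2))
p_cancer_given_xray = num / den
```

Enumerating the joint like this is only feasible for tiny models; the theorems referred to below are what make the same inferences tractable in large graphs.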
Because of their structure, graphical models lend themselves to modularity, in which a complex system is built from simpler parts. And through the theorems developed for graphical models (Jensen 1996), sound probabilistic inferences can be made with respect to the structure of a graphical model and its associated probabilities. Consequently, graphical models have been applied to a diversity of clinical problems (see e.g. Kazi et al. 1998; Nikiforidis & Sakellaropoulos 1998). An instructive example is the application of graphical models to the diagnosis of 'blue' babies (Spiegelhalter et al. 1993).
The nodes of a graphical model can correspond to hidden variables as well as to observable variables; thus MLPs (and RBFNs) can be regarded as directed graphical models, for both have nodes, hidden and visible, linked by directed edges (Neal 1992). An example of this is Bishop's work on latent variable models, which he has regarded from both neural network and graphical model viewpoints (Bishop et al. 1996; Bishop 1999). But graphical models are not confined to the layered structure of MLPs; therefore, the structure of a graphical model can, in principle, provide a more accurate model of a joint probability distribution (Binder et al. 1997), and thus a more accurate probability model in those situations where the variables dictate such a possibility.
In the 1970s and early 1980s, knowledge-based systems were the focus of applied artificial intelligence, but the so-called 'knowledge-acquisition bottleneck' shifted the focus during the 1980s to methods, such as ANNs, in which knowledge could be extracted directly from data. There is now interest in combining background knowledge (theoretical and heuristic) with data, and graphical models provide a suitable framework to enable this fusion to take place. Thus a unification or integration of ANNs with graphical models is a natural direction to explore.
Overview of the remaining chapters
This book covers a wide range of topics pertaining to artificial neural networks for clinical medicine, and the remaining chapters are divided into four parts: I Applications, II Prospects, III Theory and IV Ethics and Clinical Practice. The first of these, Applications, is concerned with established or prototypic medical decision-support systems that incorporate artificial neural networks. The section begins with an article by Cross (Chapter 2), who provides an extensive review of how artificial neural networks have dealt with the explosion of information that has taken place within clinical laboratories. This includes hepatological, radiological and clinical-chemical applications, amongst others.
The PAPNET system for screening cervical carcinoma was one of the first neural computational systems developed for medical use. Boon and Kok (Chapter 3) give an update on this system, and they do this from the viewpoints of the various parties involved in the screening process, such as the patient, pathologist and gynaecologist.
QUESTAR is one of the most successful artificial neural network-based systems developed for medicine, and Tarassenko et al. (Chapter 4) describe how QUESTAR/BioSleep analyses the sleep of people with severe disorders such as obstructive sleep apnoea and Cheyne–Stokes respiration.
Chapter 5 by Braithwaite et al. describes Mary, a prototypic online system designed to predict the onset of respiratory disorders in babies that have been born prematurely. The authors have compared the performance of the multilayer perceptron incorporated within Mary with that of a linear discriminant classifier, and they also describe some preliminary findings based on Kohonen self-organizing feature maps.
Niederberger and Golden (Chapter 6) describe another application based on multilayer perceptrons, namely, the neUROn urological system. This predicts stone recurrence following extracorporeal shock wave lithotripsy, a non-invasive procedure for the disruption and removal of renal stones. As with Chapter 5, they compare the performance of the MLP with a linear discriminant classifier. They also describe the use of Wilks' generalized likelihood ratio test to select which variables to use as input for the multilayer perceptron. An interesting adjunct to their work is the availability of a demonstration of neUROn via the World Wide Web.
This section closes with a review by Goodacre (Chapter 7) on the instrumental approaches to the classification of microorganisms and the use of multilayer perceptrons to interpret the resulting multivariate data. This work is a response to the growing workload of clinical microbiology laboratories, and the need for rapid and accurate identification of microorganisms for clinical management purposes.
In the section entitled Prospects, a number of feasibility studies are presented. The first of these is by Anderson and Peterson (Chapter 8), who provide a description of how feedforward networks were used for the analysis of electroencephalograph waveforms. This includes a description of how independent components analysis was used to address the problem of eye-blink contamination.
ARTMAP networks are one of the least-used ANN techniques. These networks provide a form of rule extraction to complement the rule-extraction techniques developed for multilayer perceptrons, and Harrison et al. (Chapter 9) describe how ARTMAP and fuzzy ARTMAP can be used to automatically update a knowledge base over time. They do so in the context of the electrocardiograph (ECG) diagnosis of myocardial infarction and the cytopathological diagnosis of breast lesions.