
Vol. Iss. Year

Intl. Jrnl. on Human Machine Interaction

Comparative Study of Expansion Functions for Evolutionary Hybrid Functional Link Artificial Neural Networks for Data Mining and Classification

Faissal MILI, Manel HAMDI

Applied Economics and Simulation, Faculty of Management and Economic Sciences, 5100 Mahdia, Tunisia
International Finance Group Tunisia, Faculty of Management and Economic Sciences of Tunis, Tunisia

Abstract: This paper presents a comparison between different expansion functions for a specific neural network structure, the functional link artificial neural network (FLANN). This technique has been employed for classification tasks in data mining. In fact, only a few studies have used this tool for solving classification problems, and in most cases the trigonometric expansion function is used. In the present research, we propose a hybrid FLANN (HFLANN) model, where the optimization process is performed using three well-known population-based techniques: genetic algorithms, particle swarm optimization and differential evolution. This model is empirically compared using different expansion functions, and the best function is selected.

Keywords: Expansion function; Data mining; Classification; Functional link artificial neural network; Genetic algorithms; Particle swarm; Differential evolution.

I. Introduction

The classification task is a very important topic in data mining. A lot of research ([1], [2], [3]) has focused on this field over the last two decades. Data mining is a knowledge discovery process from large databases. The extracted knowledge is used by a human user to support a decision, which is the ultimate goal of data mining. Therefore, classification decision is our aim in this study. Various classification models have been used in this regard. M. James [4] employed linear/quadratic discriminant techniques for solving classification problems. Another procedure has been applied using decision trees ([5], [6]). In the same context, Duda et al. [7] proposed a discriminant analysis based on Bayesian decision theory. Nevertheless, these traditional statistical models are built mainly on various linear assumptions that must necessarily be satisfied; otherwise, these techniques cannot be applied to classification tasks. To overcome this disadvantage, artificial intelligence tools have emerged to solve data mining classification problems. For this purpose, genetic algorithm models were used [8]. In more recent research, Zhang ([9], [10]) introduced the neural network technique as a powerful classification tool. In these studies, he showed that the neural network is a promising alternative compared to various conventional classification techniques. In the more recent literature, a specific neural network structure, the functional link artificial neural network (FLANN), has been employed for the classification task of data mining. In fact, only a few studies ([11], [12], [13]) have used this tool for solving classification problems.

In the present research, we propose a hybrid FLANN (HFLANN) model based on three metaheuristic population-based optimization tools: genetic algorithms (GAs), particle swarm optimization (PSO) and differential evolution. This model is compared using different expansion functions and the best one is selected.

journals@asdfres.in | www.asdfjournals.com



II. Concepts and Definitions

A. Population-Based Algorithms

Population-based algorithms are classed as computational intelligence techniques and represent a class of robust optimization methods. They make use of a population of solutions at the same time, based on natural evolution.
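As a rough sketch, the shared generate-evaluate-vary-replace cycle of such algorithms might look like the following; the fitness function and the perturbation operator here are illustrative placeholders, not the operators used later in the paper:

```python
import random

def evolve(fitness, dim=4, pop_size=20, generations=50):
    """Generic population-based loop (sketch): lower fitness is better."""
    # 1- generate a random initial population
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        # 2- produce offspring (here: a simple random perturbation as placeholder)
        offspring = [[g + random.gauss(0, 0.1) for g in ind] for ind in pop]
        # 3- replacement: keep the better of each parent/child pair
        pop = [min(pair, key=fitness) for pair in zip(pop, offspring)]
        best = min(pop + [best], key=fitness)
    return best
```

The algorithms below (GA, PSO, DE) differ precisely in how the offspring and replacement steps are realized.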

Many population-based algorithms are presented in the literature, such as evolutionary programming [14], evolution strategy [15], genetic algorithms [16], genetic programming [17], ant colony optimization [18], particle swarm [19] and differential evolution [20]. These algorithms differ in their selection, offspring generation and replacement mechanisms. Genetic algorithms, particle swarm and differential evolution represent the most popular ones.

1) Genetic algorithms

Genetic algorithms (GAs) are defined as a search technique inspired by Darwinian theory. The idea is based on the theory of natural selection. We assume that there is a population composed of individuals with different characteristics. The stronger ones are able to survive and pass their characteristics on to their offspring.

The total process is described as follows:

1- Generate randomly an initial population;

2- Evaluate this population using the fitness function;

3- Apply genetic operators such as selection, crossover, and mutation;

4- Repeat the "Evaluation-Crossover-Mutation" process until reaching the stopping criterion fixed a priori.

2) Particle swarm

Presented in 1995 by J. Kennedy and R. Eberhart [19], particle swarm optimization (PSO) represents one of the most widely known population-based approaches, where particles change their positions over time. These particles fly around in a multidimensional search space, and each particle adjusts its position according to its own experience and the experience of its neighbors, making use of the best position encountered by itself and its neighbors. The direction of a particle is defined by its set of neighbors and its corresponding history of experience.

An individual particle i is composed of three vectors:

- Its position in the V-dimensional search space: Xi = (x_i1, x_i2, ..., x_iV)

- The best position that it has individually found: Pi = (p_i1, p_i2, ..., p_iV)

- Its velocity: Vi = (v_i1, v_i2, ..., v_iV)




Particles were originally initialized in a uniform random manner throughout the search space; velocity is also randomly initialized.
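This initialization, together with the velocity and position update rules (1) and (2) given next, can be sketched as follows; the values used here for the constriction factor χ and the weight W are illustrative assumptions, since the text fixes only C = 2.0:

```python
import random

def init_particle(dim, lo=-1.0, hi=1.0):
    """Uniform random position and velocity, as described above."""
    x = [random.uniform(lo, hi) for _ in range(dim)]
    v = [random.uniform(lo, hi) for _ in range(dim)]
    return x, v

def update_particle(x, v, p_best, p_neigh, chi=0.729, w=1.0, c=2.0):
    """One application of the velocity/position rules, equations (1)-(2)."""
    for d in range(len(x)):
        phi1, phi2 = random.random(), random.random()  # fresh per dimension
        v[d] = chi * (w * v[d]
                      + c * phi1 * (p_best[d] - x[d])    # pull toward own best
                      + c * phi2 * (p_neigh[d] - x[d]))  # pull toward neighbor best
        x[d] = x[d] + v[d]
    return x, v
```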

These particles then move throughout the search space by a fairly simple set of update equations. The algorithm updates the entire swarm at each time step by updating the velocity and position of each particle in every dimension by the following rules:

v_id = χ * (W * v_id + C * φ1 * (p_id - x_id) + C * φ2 * (p_gd - x_id))   (1)

x_id = x_id + v_id   (2)

where, in the original equations:

- C is a constant with the value of 2.0;
- φ1 and φ2 are independent random numbers, uniquely generated at every update for each individual dimension (d = 1 to V);
- p_g is the best position found by the global population of particles;
- p_l is the best position found by any neighbor of the particle;
- W is the weight;
- χ is the constriction factor.

3) Differential evolution

Proposed by Storn and Price in 1995 [20], differential evolution (DE) represents a floating-point evolutionary algorithm using a special kind of differential operator. Easy implementation and negligible parameter tuning make this algorithm quite popular.

Like any evolutionary algorithm, differential evolution starts with a population. Differential evolution is a small and simple mathematical model of the big and naturally complex process of evolution, so it is easy and efficient.

Firstly, there are five DE strategies (or schemes) that were proposed by R. Storn and K. Price [20]:

• Scheme DE/rand/1: v = x1 + F * (x2 - x3)   (3)

• Scheme DE/rand/2: v = x5 + F * (x1 + x2 - x3 - x4)   (4)

• Scheme DE/best/1: v = xbest + F * (x1 - x2)   (5)

• Scheme DE/best/2: v = xbest + F * (x1 + x2 - x3 - x4)   (6)

• Scheme DE/rand-to-best/1: v = x + λ * (xbest - x1) + F * (x2 - x3)   (7)

Later, two more strategies were introduced [21]. We present the trigonometric scheme, defined by:




v = (x1 + x2 + x3)/3 + (p2 - p1) * (x1 - x2) + (p3 - p2) * (x2 - x3) + (p1 - p3) * (x3 - x1)   (8)

pi = |f(xi) / (f(x1) + f(x2) + f(x3))|,  i = 1, 2, 3   (9)

where x denotes the selected element, and x1, x2, x3, x4 and x5 represent randomly generated elements from the population. F denotes the scaling factor, generally taken equal to 0.5. Many other schemes can be found in the literature [20].
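Two of these mutation schemes, DE/rand/1 from equation (3) and the trigonometric scheme from equations (8)-(9), can be sketched as follows; F = 0.5 follows the text, while the fitness function is left as a caller-supplied parameter:

```python
def de_rand_1(x1, x2, x3, F=0.5):
    """Scheme DE/rand/1, equation (3): v = x1 + F*(x2 - x3)."""
    return [a + F * (b - c) for a, b, c in zip(x1, x2, x3)]

def de_trigonometric(x1, x2, x3, f):
    """Trigonometric scheme, equations (8)-(9); f is the fitness function."""
    total = f(x1) + f(x2) + f(x3)
    p1, p2, p3 = (abs(f(x) / total) for x in (x1, x2, x3))
    return [(a + b + c) / 3.0
            + (p2 - p1) * (a - b)
            + (p3 - p2) * (b - c)
            + (p1 - p3) * (c - a)
            for a, b, c in zip(x1, x2, x3)]
```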

B. Functional Link Artificial Neural Networks

The FLANN architecture was originally proposed by Pao et al. [22]. The basic idea of this model is to apply an expansion function which increases the dimensionality of the input vector. The hyperplanes thus generated provide greater discrimination capability in the input pattern space. By applying this expansion, the hidden layer is no longer needed, making the learning algorithm simpler. Thus, compared to the MLP structure, this model has the advantage of a faster convergence rate and a lower computational cost.

The conventional nonlinear functional expansions which can be employed are of trigonometric, power series, Chebyshev polynomial or Legendre polynomial type. R. Majhi et al. [23] show that the use of trigonometric expansion provides better prediction capability of the model. Hence, in the present case, we aim to validate the best expansion function for the proposed model.

Let each element of the input pattern before expansion be represented as X(i), 1 < i < I, where I is the total number of features, and let each element x(i) be functionally expanded as Zn(i), 1 < n < N, where N is the number of expanded points for each input element. In this study, we take N = 5.

As presented in figure 1, the expansion of each input pattern is done as follows:

Z1(i) = X(i), Z2(i) = f1(X(i)), ..., Z5(i) = f5(X(i))   (10)

These expanded inputs are then fed to the single-layer neural network and the network is trained to obtain the desired output. The different expansion functions are described next.

C. Expansion functions

Four expansion functions are used in this work: the trigonometric, the Chebyshev polynomial, the Chebyshev Legendre polynomial and the power series. Their different characteristics are presented in the four graphics.

Figure 1. Trigonometric functional expansion of the first element






Figure 2. Chebyshev polynomials functional expansion of the first element



Figure 3. Chebyshev Legendre polynomials functional expansion of the first element



Figure 4. Power series functional expansion of the first element
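The four N = 5 expansion families shown in figures 1-4 can be sketched as below; the exact basis terms (e.g. sin(πx) versus sin(x)) are illustrative assumptions, since the paper names only the families:

```python
import math

def trigonometric(x):
    # Z1 = x, followed by alternating sine/cosine harmonics
    return [x, math.sin(math.pi * x), math.cos(math.pi * x),
            math.sin(2 * math.pi * x), math.cos(2 * math.pi * x)]

def chebyshev(x):
    # T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x), with T_0 = 1, T_1 = x
    t = [1.0, x]
    for _ in range(4):
        t.append(2 * x * t[-1] - t[-2])
    return [x] + t[2:6]  # Z1 = x, then T2..T5

def legendre(x):
    # (n+1) P_{n+1}(x) = (2n+1) x P_n(x) - n P_{n-1}(x), with P_0 = 1, P_1 = x
    p = [1.0, x]
    for n in range(1, 5):
        p.append(((2 * n + 1) * x * p[n] - n * p[n - 1]) / (n + 1))
    return [x] + p[2:6]  # Z1 = x, then P2..P5

def power_series(x):
    return [x, x ** 2, x ** 3, x ** 4, x ** 5]
```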

III. Hybrid FLANN Description

The proposed hybrid FLANN is based on evolutionary algorithms, namely genetic algorithms, particle swarm and differential evolution.




A. Resampling Technique

In order to avoid overfitting, we use the (2×5) K-fold cross-validation resampling technique. We proceed as follows:

We divide the initial database into 5 folds (K = 5), where each fold contains the same repartition of classes. For example, if the initial population contains 60% of class 1 and 40% of class 2, then all the resulting K folds must have the same repartition.

B. Generation

We begin the process by randomly generating an initial solution. We execute a partial training using differential evolution in order to improve the initial state.

C. Fitness Function and Evaluation

In order to evaluate each solution, two criteria are used: the mean square error (MSE) and the misclassification error (MCE) rate. If we have to compare solutions A and B, we apply the following rule: A is preferred to B if and only if MCE(A) < MCE(B), or MCE(A) = MCE(B) and MSE(A) < MSE(B).

D. Selection

Many selection methods are defined in the literature, such as the roulette wheel method, the N/2 elitist method and the tournament selection method. The last method is used here: the principle is to compare two solutions, and the best one is selected. The N/2 elitist method is used at the beginning of the process in order to select 50% of the generated solutions.

E. Crossover

Two parents are selected randomly in order to exchange their information. Three crossovers are applied, described as follows:

1) Crossover 1 (over input features): An input feature is chosen randomly and its corresponding weights are exchanged between the two selected parents.

2) Crossover 2 (over output nodes): An output node is chosen randomly and its corresponding weights are exchanged.

3) Crossover 3 (over connections): A connection position is chosen randomly and its corresponding weight is exchanged between the two parents.

F. Mutation

1) Mutation 1 (over a connection)

A connection position is chosen randomly and its corresponding weight is inspected. If the connection is connected, it is disconnected by setting its weight equal to zero; otherwise, the connection is connected.
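This toggle can be sketched as follows; representing the network's weights as a flat list, and the reconnection range, are illustrative assumptions:

```python
import random

def mutate_connection(weights, lo=-1.0, hi=1.0):
    """Mutation 1: pick one connection; disconnect it if connected
    (weight set to zero), otherwise reconnect it with a random weight."""
    k = random.randrange(len(weights))
    if weights[k] != 0.0:              # connected -> disconnect
        weights[k] = 0.0
    else:                              # disconnected -> reconnect
        weights[k] = random.uniform(lo, hi)
    return weights
```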

2) Mutation 2 (over one input feature)

An input feature is chosen randomly and its corresponding weights are inspected. If this input feature is connected (at least one of its corresponding weights is different from zero), it will be disconnected by setting all of its weights equal to zero. Otherwise, if this input feature is totally disconnected, it will be connected by generating weights different from zero.

3) Mutation 3 (over two input features)

We do the same as in mutation 2, but here simultaneously for the two selected features.

4) Mutation 4 (over three input features)

In this mutation, the same principle is used for three input features.

We note that many input feature connections and disconnections can be executed at the same time when there is a large number of features. This mutation helps to remove undesirable features from our classification process and can improve the final performance.

G. Particle swarm optimization (PSO)

In the present paper, we define three PSO models based on the notion of neighborhood.

1) PSO based on resulting genetic offspring: First, we apply the genetic operators. Each offspring that improves our fitness function defines a neighbor and is used in equation (1).

2) PSO based on Euclidean distance: For each particle, we compute the Euclidean distance between this particle and the rest of the population. Next, we choose the five nearest particles based on this distance. From this selected subset of neighbors, we choose the one with the best fitness value. This selected particle defines the neighbor to be used in equation (1).

3) PSO based on the last best visited solution: In this case, each particle flies and memorizes its best reached solution. This memory defines the neighbor to be used in equation (1).
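The Euclidean-distance neighborhood of variant 2 above can be sketched as follows (here lower fitness is better, an illustrative convention):

```python
import math

def best_euclidean_neighbor(particle, population, fitness, k=5):
    """Pick the k nearest particles to `particle`, then return the one
    with the best (lowest) fitness among them."""
    others = [p for p in population if p is not particle]
    others.sort(key=lambda p: math.dist(particle, p))  # nearest first
    return min(others[:k], key=fitness)
```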

H. Differential evolution

In this work, we proceed as follows:

- First, for each candidate x, we generate five random solutions x1, x2, x3, x4 and x5.
- Next, we apply seven chosen schemes as follows:

DE1: Scheme DE/direct/1: v = x + F * (x2 - x1)   (11)

DE2: Scheme DE/best/1: v = xbest + F * (x2 - x1)   (12)

DE3: Scheme DE/best/1: v = xbest + F * (x3 - x2)   (13)

DE4: Scheme DE/best/1: v = xbest + F * (x3 - x1)   (14)

DE5: Scheme DE/best/2: v = xbest + F * (x1 + x2 - x3 - x4)   (15)

DE6: Scheme DE/rand/2: v = x5 + F * (x1 + x2 - x3 - x4)   (16)




DE7: Scheme with trigonometric mutation:

v = (x1 + x2 + x3)/3 + (p2 - p1) * (x1 - x2) + (p3 - p2) * (x2 - x3) + (p1 - p3) * (x3 - x1)   (17)

pi = |f(xi) / (f(x1) + f(x2) + f(x3))|,  i = 1, 2, 3   (18)

I. Stopping criterion

The process runs in a cycle until reaching a maximum number of epochs without any improvement. We fix this maximum number of epochs equal to 30.
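This stopping rule can be sketched as a simple stagnation counter; the `step` callback standing in for one training epoch is an illustrative assumption:

```python
def run_until_stagnation(step, max_stale=30):
    """Run `step()` (returns the current best fitness, lower is better)
    until `max_stale` consecutive epochs bring no improvement."""
    best, stale, epochs = float("inf"), 0, 0
    while stale < max_stale:
        fit = step()
        epochs += 1
        if fit < best:
            best, stale = fit, 0   # improvement: reset the counter
        else:
            stale += 1             # no improvement this epoch
    return best, epochs
```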

IV. Experimental Studies

Eleven real-world databases were selected to be used in the simulation work. They are chosen from the UCI machine learning repository, which is commonly used to benchmark learning algorithms [24]. We compare the results of the proposed hybrid FLANN (HFLANN) with a FLANN based on the gradient descent algorithm. Next, a comparison with other classifiers is done.

A. Description of the Databases

A brief description of the databases used for the experimental setup is presented in Table I. Num. denotes the numeric features, Bin. the binary ones, and Nom. the nominal inputs, meaning discrete with three or more distinct labels.

Table I. Summary of the Datasets Used in the Simulation Studies

B. Initial population improvement

A population is first generated randomly, and its performance is presented in column 2 of Table II. Random generation gives poor results, needing some initial improvement. For this aim, we propose to use three prior improving algorithms: back-propagation, differential evolution, and a mixed back-propagation/differential-evolution one. From columns 3 and 4, we observe that back-propagation has a better improving performance than differential evolution. In the last column, mixed back-propagation/differential-evolution results are presented. Compared to the single-algorithm results, the mixed algorithm gives the best result, and it is used in our process as the prior improving algorithm.

Table II. Performance of the initial population with different prior improving algorithms

| Run | Random Generation | Random Generation with BP | Random Generation with DE | Mixed BP-DE |
|-----|-------------------|---------------------------|---------------------------|-------------|
| 1 | 20.57096028 | 6.6667 | 23.3333 | 0 |
| 2 | 41.6667 | 0 | 41.6667 | 3.3333 |
| 3 | 65 | 0 | 36.6667 | 1.6667 |
| 4 | 65 | 0 | 48.3333 | 3.3333 |
| 5 | 38.3333 | 6.6667 | 21.6667 | 0 |
| 6 | 68.3333 | 6.6667 | 25 | 1.6667 |
| 7 | 58.3333 | 6.6667 | 25 | 0 |
| 8 | 83.3333 | 0 | 26.6667 | 0 |
| 9 | 76.6667 | 6.6667 | 16.6667 | 1.6667 |
| 10 | 96.6667 | 0 | 25 | 3.3333 |
| 11 | 45 | 0 | 25 | 0 |
| 12 | 38.3333 | 6.6667 | 30 | 1.6667 |
| 13 | 30 | 6.6667 | 30 | 1.6667 |
| 14 | 40 | 6.6667 | 21.6667 | 0 |
| 15 | 25 | 6.6667 | 38.3333 | 0 |
| 16 | 50 | 0 | 33.3333 | 0 |
| 17 | 35 | 0 | 41.6667 | 0 |
| 18 | 26.6667 | 6.6667 | 30 | 3.3333 |
| 19 | 25 | 6.6667 | 26.6667 | 0 |
| 20 | 40 | 0 | 13.3333 | 0 |
| Mean | 47.44521301 | 3.666685 | 29.000005 | 1.083335 |
| Std. dev. | 20.99774004 | 3.402802251 | 8.8921693 | 1.354538289 |

C. Convergence test

In order to test the convergence of the proposed hybrid FLANN, a comparison is done with a FLANN trained using the back-propagation algorithm. Results are presented in figure 5 and figure 6. The comparison is based on the required time and number of epochs for convergence.

From figure 5, we find that our process needs less than 200 seconds and 20 epochs to converge. Figure 6 presents results for the FLANN based on back-propagation. This model requires less than 150 seconds and 15 epochs to converge.

The proposed hybrid FLANN has a strong ability to converge fast and requires approximately the same time and number of epochs as the FLANN based on back-propagation.






Figure 5. The MSE hybrid FLANN results vs. time (a) and epochs (b), applied to the iris database







Figure 6. The MSE FLANN based back-propagation results vs. time (a) and epochs (b), applied to the iris database

D. Comparative study

Table III. Average Comparative Performance of HFLANN Based on Different Expansion Functions

| Database | FLANN based BP | Trigonometric HFLANN with local search | Chebyshev HFLANN with local search | Power series HFLANN with local search | Legendre Chebyshev HFLANN with local search |
|----------|----------------|----------------------------------------|------------------------------------|---------------------------------------|---------------------------------------------|
| IRIS | 0.8933 | 0.96667 | 0.94667 | 0.94667 | 0.95333 |
| VOTING | 0.7829 | 0.94498 | 0.94049 | 0.94001 | 0.95877 |
| BREAST | 0.9298 | 0.9599 | 0.96704 | 0.96575 | 0.94259 |
| PRIMA | 0.6536 | 0.73964 | 0.73332 | 0.738 | 0.75142 |
| CREDIT | 0.5935 | 0.85316 | 0.83698 | 0.67253 | 0.84962 |
| BALANCE | 0.6036 | 0.89779 | 0.88669 | 0.86359 | 0.83266 |
| WINE | 0.9035 | 0.96111 | 0.92582 | 0.97222 | 0.90905 |
| BUPA | 0.5392 | 0.69328 | 0.66101 | 0.69849 | 0.62941 |
| ECOLI | 0.6279 | 0.81661 | 0.77389 | 0.8219 | 0.68185 |
| GLASS | 0.3463 | 0.61769 | 0.61047 | 0.53167 | 0.46039 |
| ZOO | 0.4163 | 0.85606 | 0.85126 | 0.88404 | 0.78934 |
| Mean of means | 0.66272 | 0.84608 | 0.83033 | 0.82135 | 0.79619 |
| Mean of standard deviations | - | 0.05276 | 0.05419 | 0.06811 | 0.06124 |
| Rank | - | 1 | 3 | 4 | 2 |

Table III presents the results of the proposed model using the four different expansion functions. We find that our model gives better results than the FLANN based on the back-propagation algorithm.

By comparing the different expansion functions, we find that the trigonometric expansion function is the best one, having the best mean of performance (0.84608) and the smallest mean of standard deviation (0.05276). This expansion function gives the best results over 7 of the 11 databases.

We can conclude that the trigonometric expansion function is the best one.




V. Conclusion

A HFLANN was proposed based on three population-based algorithms: genetic algorithms, differential evolution and particle swarm. This classifier shows its ability to converge fast and gives better performance than the FLANN based on back-propagation.

Based on our experimentation, and compared to the other expansion functions, the trigonometric one is found to be the best.

In future work, we can add a wrapper approach able to delete irrelevant features automatically. We can also apply the HFLANN to other data mining problems, such as prediction. Other evolutionary algorithms can be included in the proposed process in order to achieve better results. Other comparison criteria can be used, such as the required speed and the robustness of the algorithm.

References

[1] R. Agrawal, T. Imielinski and A. Swami, "Database mining: A performance perspective". IEEE Trans. Knowledge Data Eng., 5, pp. 914-925, 1993.

[2] U. M. Fayyad, G. Piatetsky-Shapiro and P. Smyth, "From data mining to knowledge discovery: An overview". In U. M. Fayyad, G. Piatetsky-Shapiro, & P. Smyth (Eds.), Advances in Knowledge Discovery and Data Mining, pp. 1-34. Menlo Park, CA: AAAI Press, 1996.

[3] H. P. Kriegel et al., "Future trends in data mining". Data Mining and Knowledge Discovery, 15(1), pp. 87-97. Netherlands: Springer, 2007.

[4] M. James, "Classification Algorithms". Wiley, 1985.

[5] L. Breiman, J. H. Friedman, R. A. Olshen and C. J. Stone, "Classification and Regression Trees". Wadsworth, 1984.

[6] H. P. Kriegel et al., "Future trends in data mining". Data Mining and Knowledge Discovery, 15(1), pp. 87-97. Netherlands: Springer, 2007.

[7] R. O. Duda, P. E. Hart and D. G. Stork, "Pattern Classification". Wiley, New York, 2001.

[8] D. E. Goldberg, "Genetic Algorithms in Search, Optimization and Machine Learning". Addison-Wesley, 1989.

[9] G. P. Zhang, "Neural networks for classification: A survey". IEEE Transactions on Systems, Man, and Cybernetics - Part C: Applications and Reviews, 30(4), pp. 451-461, 2000.

[10] G. P. Zhang, "Avoiding pitfalls in neural network research". IEEE Transactions on Systems, Man, and Cybernetics - Part C: Applications and Reviews, 37(1), pp. 3-16, 2007.

[11] R. Majhi, G. Panda and G. Sahoo, "Development and performance evaluation of FLANN based model for forecasting of stock markets". Expert Systems with Applications, 36, pp. 6800-6808, 2009.

[12] S. Dehuri and S. B. Cho, "A hybrid genetic based functional link artificial neural network with a statistical comparison of classifiers over multiple datasets". Neural Computing & Applications, 19, pp. 317-328, 2010a.




[13] S. Dehuri and S. B. Cho, "Evolutionarily optimized features in functional link neural network for classification". Expert Systems with Applications, 37, pp. 4379-4391, 2010b.

[14] L. Fogel, J. Owens and J. Walsh, "Artificial Intelligence through Simulated Evolution". John Wiley, Chichester, 1966.

[15] D. E. Goldberg, "Genetic Algorithms in Search, Optimization and Machine Learning". Addison-Wesley, 1989.

[16] J. Holland, "Adaptation in Natural and Artificial Systems". Univ. of Michigan Press, Ann Arbor, 1975.

[17] J. Koza, "Genetic Programming: On the Programming of Computers by Means of Natural Selection". Cambridge, MA: MIT Press, 1992.

[18] M. Dorigo and T. Stutzle, "Ant Colony Optimization". MIT Press, Cambridge, 2004.

[19] J. Kennedy and R. Eberhart, "Particle Swarm Optimization". In Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, 1995.

[20] R. Storn and K. Price, "Differential evolution - A simple and efficient adaptive scheme for global optimization over continuous spaces". Technical Report TR-95-012, International Computer Science Institute, Berkeley, CA, 1995.

[21] H.-Y. Fan and J. Lampinen, "A trigonometric mutation approach to differential evolution". In K. C. Giannakoglou, D. T. Tsahalis, J. Periaux, K. D. Papailiou, & T. Fogarty (Eds.), Evolutionary Methods for Design Optimization and Control with Applications to Industrial Problems, pp. 65-70, Athens, Greece: International Center for Numerical Methods in Engineering (CIMNE), 2001.

[22] Y.-H. Pao, S. M. Phillips and D. J. Sobajic, "Neural-net computing and intelligent control systems". Int. J. Control, 56, pp. 263-289, 1992.

[23] R. Majhi, G. Panda and G. Sahoo, "Development and performance evaluation of FLANN based model for forecasting of stock markets". Expert Systems with Applications, 36, pp. 6800-6808, 2009.

[24] C. L. Blake and C. J. Merz, "UCI Repository of machine learning databases". Irvine, CA: University of California, Department of Information and Computer Science. Available online at: http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998.
