The listing below is the source of svmtrain; it can be opened in MATLAB with edit svmtrain (and likewise edit svmclassify, edit svmpredict).

function [svm_struct, svIndex] = svmtrain(training, groupnames, varargin)
%SVMTRAIN trains a support vector machine classifier
%   SVMStruct = SVMTRAIN(TRAINING, GROUP) trains a support vector machine
%   classifier using data TRAINING taken from two groups given by GROUP.
%   SVMStruct contains information about the trained classifier that is
%   used by SVMCLASSIFY for classification. GROUP is a column vector of
%   values of the same length as TRAINING that defines two groups. Each
%   element of GROUP specifies the group the corresponding row of TRAINING
%   belongs to. GROUP can be a numeric vector, a string array, or a cell
%   array of strings. SVMTRAIN treats NaNs or empty strings in GROUP as
%   missing values and ignores the corresponding rows of TRAINING.
%
%   SVMTRAIN(..., 'KERNEL_FUNCTION', KFUN) allows you to specify the kernel
%   function KFUN used to map the training data into kernel space. The
%   default kernel function is the dot product. KFUN can be one of the
%   following strings or a function handle:
%
%       'linear'      Linear kernel or dot product
%       'quadratic'   Quadratic kernel
%       'polynomial'  Polynomial kernel (default order 3)
%       'rbf'         Gaussian Radial Basis Function kernel
%       'mlp'         Multilayer Perceptron kernel (default scale 1)
%       function      A kernel function specified using @,
%                     for example @KFUN, or an anonymous function
%
%   A kernel function must be of the form
%
%       function K = KFUN(U, V)
%
%   The returned value, K, is a matrix of size M-by-N, where U and V have M
%   and N rows respectively. If KFUN is parameterized, you can use
%   anonymous functions to capture the problem-dependent parameters. For
%   example, suppose that your kernel function is
%
%       function k = kfun(u, v, p1, p2)
%       k = tanh(p1*(u*v') + p2);
%
%   You can set values for p1 and p2 and then use an anonymous function:
%   @(u,v) kfun(u, v, p1, p2).
%
%   SVMTRAIN(..., 'POLYORDER', ORDER) allows you to specify the order of a
%   polynomial kernel. The default order is 3.
%
%   SVMTRAIN(..., 'MLP_PARAMS', [P1 P2]) allows you to specify the
%   parameters of the Multilayer Perceptron (mlp) kernel. The mlp kernel
%   requires two parameters, P1 and P2, where K = tanh(P1*U*V' + P2) and
%   P1 > 0 and P2 < 0. Default values are P1 = 1 and P2 = -1.
%
%   SVMTRAIN(..., 'METHOD', METHOD) allows you to specify the method used
%   to find the separating hyperplane. Options are
%
%       'QP'  Use quadratic programming (requires the Optimization Toolbox)
%       'LS'  Use least-squares method
%
%   If you have the Optimization Toolbox, then the QP method is the default
%   method. If not, the only available method is LS.
%
%   SVMTRAIN(..., 'QUADPROG_OPTS', OPTIONS) allows you to pass an OPTIONS
%   structure created using OPTIMSET to the QUADPROG function when using
%   the QP method. See help optimset for more details.
%
%   SVMTRAIN(..., 'SHOWPLOT', true), when used with two-dimensional data,
%   creates a plot of the grouped data and plots the separating line for
%   the classifier.
%
%   Example:
%       % Load the data and select features for classification
%       load fisheriris
%       data = [meas(:,1), meas(:,2)];
%       % Extract the Setosa class
%       groups = ismember(species, 'setosa');
%       % Randomly select training and test sets
%       [train, test] = crossvalind('holdOut', groups);
%       cp = classperf(groups);
%       % Use a linear support vector machine classifier
%       svmStruct = svmtrain(data(train,:), groups(train), 'showplot', true);
%       classes = svmclassify(svmStruct, data(test,:), 'showplot', true);
%       % See how well the classifier performed
%       classperf(cp, classes, test);
%       cp.CorrectRate
%
%   See also CLASSIFY, KNNCLASSIFY, QUADPROG, SVMCLASSIFY.

%   Copyright 2004 The MathWorks, Inc.
%   $Revision: 1.1.12.1 $  $Date: 2004/12/24 20:43:35 $

%   References:
%     [1] Kecman, V., Learning and Soft Computing,
%         MIT Press, Cambridge, MA. 2001.
%     [2] Suykens, J.A.K., Van Gestel, T., De Brabanter, J., De Moor, B.,
%         Vandewalle, J., Least Squares Support Vector Machines,
%         World Scientific, Singapore, 2002.
%     [3] Scholkopf, B., Smola, A.J., Learning with Kernels,
%         MIT Press, Cambridge, MA. 2002.
%
%   SVMTRAIN(..., 'KFUNARGS', ARGS) allows you to pass additional
%   arguments to kernel functions.

% set defaults
plotflag = false;
qp_opts = [];
kfunargs = {};
setPoly = false; usePoly = false;
setMLP = false; useMLP = false;
if ~isempty(which('quadprog'))
    useQuadprog = true;
else
    useQuadprog = false;
end

% set default kernel function
kfun = @linear_kernel;

% check inputs
if nargin < 2
    error(nargchk(2, Inf, nargin));
end

% index the group labels and drop rows with missing values
[g, groupString] = grp2idx(groupnames);
nans = find(isnan(g));
if ~isempty(nans)
    training(nans, :) = [];
    g(nans) = [];
end
ngroups = length(groupString);
if ngroups > 2
    error('Bioinfo:svmtrain:TooManyGroups', ...
        ['SVMTRAIN only supports classification into two groups.\n', ...
         'GROUP contains %d different groups.'], ngroups)
end
% convert to 1, -1
g = 1 - (2 * (g - 1));
% handle opt
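The help text specifies the contract a custom kernel must satisfy: given U (M rows) and V (N rows), KFUN must return an M-by-N matrix K. As a minimal sketch of that contract, here is a Gaussian (RBF) kernel with an explicit width parameter; the function name rbf_kfun and the parameter sigma are illustrative choices, not part of the toolbox:

```matlab
function K = rbf_kfun(u, v, sigma)
% Hypothetical custom kernel following the K = KFUN(U,V) contract from the
% help text: u is M-by-D, v is N-by-D, and K must be M-by-N.
% sigma is an assumed kernel-width parameter, captured later via an
% anonymous function as the help text describes.
m = size(u, 1);
n = size(v, 1);
% Pairwise squared Euclidean distances between rows of u and rows of v.
sqdist = repmat(sum(u.^2, 2), 1, n) + repmat(sum(v.^2, 2)', m, 1) - 2*(u*v');
K = exp(-sqdist / (2*sigma^2));
end
```

Under that assumption, it would be passed in with the parameter-capture pattern the help text shows for kfun, e.g. svmtrain(data, groups, 'kernel_function', @(u,v) rbf_kfun(u, v, 0.5)).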