In real-world applications, the extraction of 'information elements' from sensor data for producing situation pictures is by no means trivial. It requires data and information fusion links with a sufficient bandwidth, small latency, and robustness against disturbance by hostile measures. Moreover, the transformation of the sensor data into a common reference frame requires a precise space-time registration of the sensors, including their mutual alignment. Figure 1 provides an overview of different aspects and their mutual interrelation. The sensors may be located in different positions (collocated, distributed, or mobile), producing measurements of the same or of a different type. Fusion of heterogeneous sensor data is of particular importance, such as the combination of kinematic measurements with measured attributes providing information on the classes to which the objects belong. In the context of defense and security applications especially, the distinction between active and passive sensing is important, since passive sensors enable covert surveillance. Sensor management provides feedback via control or steering commands to the sources of information acquisition; by this the surveillance objective can often be reached more efficiently. Context information is given, for example, by available knowledge on the sensor and object properties, which is often quantitatively described by statistical models.
Context knowledge also comprises environmental information such as road maps or topographical occlusions (GIS: Geographical Information Systems). Seen from a different perspective, context knowledge (e.g. doctrines, planning data) and human observer reports (HUMINT: Human Intelligence) are also important information sources in the fusion process.
The filtering step is given by the Kalman update equations

$$x_{k|k} = x_{k|k-1} + W_{k|k-1}\,(z_k - H_k x_{k|k-1}), \qquad P_{k|k} = P_{k|k-1} - W_{k|k-1} S_{k|k-1} W_{k|k-1}^\top,$$

with the Kalman gain matrix W_{k|k-1} and the innovation covariance matrix S_{k|k-1} given by

$$W_{k|k-1} = P_{k|k-1} H_k^\top S_{k|k-1}^{-1}, \qquad S_{k|k-1} = H_k P_{k|k-1} H_k^\top + R_k.$$

Kinematics: As an example, let us consider state vectors x_k = (r_k, \dot r_k), consisting of the object position and velocity, and position measurements z_k with measurement error covariance R_k, together with the context information that a maximum object speed is to be expected, which suggests the initialization x_{0|0} = (z_0, 0), P_{0|0} = diag[R_0, \sigma_{\dot x}^2].

Negative sensor evidence, i.e. an expected but actually missing measurement, can also be treated within the Bayesian formalism. Let us first exclude false measurements and assume that the objects of interest are detected with a constant detection probability P_D < 1.
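The Kalman update quoted above is a few lines of linear algebra. The following Python sketch illustrates it with hypothetical numeric values for the prediction, measurement matrix, and covariances; it is an illustration of the formulas, not code from the source.

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, H, R):
    """Update a Gaussian prediction N(x; x_pred, P_pred) with a measurement z."""
    S = H @ P_pred @ H.T + R                # innovation covariance S_{k|k-1}
    W = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain W_{k|k-1}
    nu = z - H @ x_pred                     # innovation z_k - H_k x_{k|k-1}
    x_filt = x_pred + W @ nu                # filtered mean x_{k|k}
    P_filt = P_pred - W @ S @ W.T           # filtered covariance P_{k|k}
    return x_filt, P_filt

# Hypothetical example: position-velocity state, position-only measurement.
x_pred = np.array([10.0, 1.0])
P_pred = np.diag([4.0, 1.0])
H = np.array([[1.0, 0.0]])
R = np.array([[1.0]])
z = np.array([11.2])
print(kalman_update(x_pred, P_pred, z, H, R))
```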
This problem is thus identical with the previously discussed Kalman filtering, except that a measurement is not necessarily available at each time t_k. In this case, the underlying sensor model, i.e. the likelihood function, has not only to describe the measurement process, characterized by the measurement matrix H_k and the measurement error covariance matrix R_k, but also the detection process, characterized by the detection probability P_D < 1. According to this model there exist two possibilities: either the object was detected at time t_k (data interpretation hypothesis i_k = 1) or it was not detected (i_k = 0). Under the assumption that the probabilities p(i_k = 1 | x_k) = P_D and p(i_k = 0 | x_k) = 1 - P_D do not depend on the object state x_k, we obtain, with \delta_{ij} = 0 for i \neq j and \delta_{ij} = 1 for i = j, the following likelihood function:

$$p(Z_k, n_k \mid x_k) = \sum_{i_k=0}^{1} p(Z_k, n_k \mid i_k, x_k)\, p(i_k \mid x_k) = \delta_{0 n_k}\,(1 - P_D) + \delta_{1 n_k}\, P_D\, \mathcal{N}(z_k;\, H_k x_k,\, R_k) \;\propto\; \ell(x_k;\, Z_k, H_k, R_k, P_D).$$

With p(x_k | Z^{k-1}) = N(x_k; x_{k|k-1}, P_{k|k-1}), Equation 1 leads to the following conclusions. For a positive sensor output (n_k = 1), standard Kalman filtering results in p(x_k | Z^k) = N(x_k; x_{k|k}, P_{k|k}), with x_{k|k} and P_{k|k} given by the Kalman update equations. For a negative sensor output (n_k = 0), the likelihood function is given by the constant 1 - P_D. This implies that the prediction pdf is not modified in the filtering step: x_{k|k} = x_{k|k-1},
P_{k|k} = P_{k|k-1}. According to the Kalman update equations, this result can formally be obtained by processing a fictitious measurement with an infinitely large error covariance matrix R_k. The Bayesian formalism and the sensor model (likelihood function) thus define how the sensor data are to be processed. In the case of well-separated objects in the presence of false returns and imperfect detection, the sensor data Z_k are no longer uniquely interpretable.
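As a minimal sketch of this detection model (constant P_D, no false returns; all inputs hypothetical), the filtering step either performs a standard Kalman update or, for a negative sensor output, returns the prediction unchanged:

```python
import numpy as np

def update_with_detection_model(x_pred, P_pred, z, H, R, detected, P_D=0.9):
    """Filtering step for a well-separated object, no false returns,
    constant detection probability P_D (sketch of the likelihood above).
    P_D cancels after normalization and is kept only for clarity."""
    if not detected:
        # Likelihood is the constant 1 - P_D: the prediction is returned
        # unchanged, formally a Kalman update with infinitely large R.
        return x_pred, P_pred
    S = H @ P_pred @ H.T + R
    W = P_pred @ H.T @ np.linalg.inv(S)
    return x_pred + W @ (z - H @ x_pred), P_pred - W @ S @ W.T
```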
Let i_k = 0 denote the data interpretation hypothesis that the object has not been detected at time t_k, all sensor data being false returns, while i_k = j, j = 1, ..., n_k, represents the hypothesis that the object has been detected, z_k^j \in Z_k being the object measurement and the remaining sensor data being false returns. Evidently, {i_k}_{i_k=0}^{n_k} is a set of mutually exclusive and exhaustive data interpretations. Due to the total probability theorem, the corresponding likelihood function is thus given by

$$\ell(x_k;\, Z_k, n_k) = \sum_{i_k=0}^{n_k} p(Z_k, n_k \mid i_k, x_k)\, p(i_k \mid x_k) = |FoV|^{-n_k} p_F(n_k)\,(1 - P_D) + |FoV|^{-(n_k-1)} p_F(n_k - 1)\,\frac{P_D}{n_k} \sum_{j=1}^{n_k} \mathcal{N}(z_k^j;\, H_k x_k,\, R_k) \tag{38}$$

$$\propto\; (1 - P_D)\,\rho_F + P_D \sum_{j=1}^{n_k} \mathcal{N}(z_k^j;\, H_k x_k,\, R_k),$$

where we assumed a constant detection probability P_D and false returns equally distributed in the field of view FoV and Poisson distributed in number; the probability of having n false returns is given by p_F(n) = (\rho_F |FoV|)^n e^{-\rho_F |FoV|}/n!, with the spatial false return density \rho_F and |FoV| denoting the volume of the field of view. See [22] for a more detailed discussion. According to Equation 1, this likelihood function implies that p(x_k | Z^k) becomes a Gaussian mixture, i.e. a weighted sum of Gaussians, whose parameters are obtained by exploiting the product formula (9).

D Gaussian Mixtures and Multiple Hypothesis Tracking: In many applications, such as group target tracking with possibly unresolved measurements, ground moving target tracking with STAP radar [21], or target tracking with a phased-array radar in the presence of jamming [10], the sensor model is described by a likelihood function of the type \ell(x_k; Z_k, n_k) \propto \sum_{i_k=0}^{n_k} p(Z_k, n_k | i_k, x_k) p(i_k | x_k) [20]. Such algorithms, essentially characterized by taking the data interpretation hypotheses i_k into account, are the basis for Multiple Hypothesis Tracking (MHT).
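A sketch of how the false-return likelihood above turns a single Gaussian prediction into a Gaussian mixture with one component per data interpretation hypothesis; the clutter density, detection probability, and all numeric inputs are assumptions for illustration only:

```python
import numpy as np
from scipy.stats import multivariate_normal

def mixture_update(x_pred, P_pred, Z, H, R, P_D=0.9, rho_F=1e-3):
    """Update a Gaussian prediction with a scan Z of n_k returns under the
    false-return likelihood sketched above (constant P_D, clutter density
    rho_F). Returns mixture weights, means and covariances, one component
    per data interpretation hypothesis i_k = 0, ..., n_k."""
    S = H @ P_pred @ H.T + R
    W = P_pred @ H.T @ np.linalg.inv(S)
    # i_k = 0: missed detection, all returns are clutter
    weights, means, covs = [(1.0 - P_D) * rho_F], [x_pred], [P_pred]
    for z in Z:  # i_k = j: z is the object measurement
        weights.append(P_D * multivariate_normal.pdf(z, H @ x_pred, S))
        means.append(x_pred + W @ (z - H @ x_pred))
        covs.append(P_pred - W @ S @ W.T)
    w = np.array(weights)
    w /= w.sum()
    return w, means, covs
```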
The pdfs that result from Bayes' rule and a Gaussian mixture prediction are again Gaussian mixtures,

$$p(x_k \mid Z^k) = \sum_{i^k} p_{i^k}\, \mathcal{N}(x_k;\, x_{k|k}^{i^k},\, P_{k|k}^{i^k}),$$

where each mixture component represents a track hypothesis, which is characterized by a sequence of data interpretation hypotheses i^k = (i_k, i_{k-1}, ...), i.e. a data interpretation history.

The structure of a Gaussian mixture for p(x_k | Z^k) also occurs if an IMM prediction p(x_k | Z^{k-1}) (see the previous subsection) is updated by using a Gaussian likelihood function \ell(x_k; z_k, H_k, R_k) = N(z_k; H_k x_k, R_k) according to Equation 1 and the product formula (Equation 9):

$$p(x_k \mid Z^k) = \frac{\mathcal{N}(z_k;\, H_k x_k,\, R_k) \sum_{j} p_{k|k-1}^{j}\, \mathcal{N}(x_k;\, x_{k|k-1}^{j},\, P_{k|k-1}^{j})}{\int dx_k\; \mathcal{N}(z_k;\, H_k x_k,\, R_k) \sum_{j} p_{k|k-1}^{j}\, \mathcal{N}(x_k;\, x_{k|k-1}^{j},\, P_{k|k-1}^{j})} = \sum_{j} p_{k|k}^{j}\, \mathcal{N}(x_k;\, x_{k|k}^{j},\, P_{k|k}^{j}),$$

where the mixture parameters x_{k|k}^j and P_{k|k}^j result from a Kalman update of each component and the new weights are given by

$$p_{k|k}^{j} \;\propto\; p_{k|k-1}^{j}\, \mathcal{N}(z_k;\, H_k x_{k|k-1}^{j},\, S_{k|k-1}^{j}),$$

with the standard Kalman gain and innovation covariance matrices W_{k|k-1}^j = P_{k|k-1}^j H_k^\top (S_{k|k-1}^j)^{-1} and S_{k|k-1}^j = H_k P_{k|k-1}^j H_k^\top + R_k. IMM filtering may thus be considered as a multiple 'model hypotheses' tracking method. Combined IMM-MHT approaches are discussed in the literature, e.g. [23].
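The component-wise Kalman update and reweighting just described can be sketched as a generic Gaussian-mixture filtering step (hypothetical inputs, single Gaussian measurement likelihood as assumed above):

```python
import numpy as np
from scipy.stats import multivariate_normal

def mixture_filter(weights, means, covs, z, H, R):
    """Kalman-update every component of a Gaussian mixture prediction and
    reweight it with the component innovation likelihoods, as in the
    IMM/MHT filtering step above (a sketch; inputs are hypothetical)."""
    new_w, new_m, new_P = [], [], []
    for w, x, P in zip(weights, means, covs):
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        new_w.append(w * multivariate_normal.pdf(z, H @ x, S))
        new_m.append(x + K @ (z - H @ x))
        new_P.append(P - K @ S @ K.T)
    new_w = np.array(new_w)
    new_w /= new_w.sum()
    return new_w, new_m, new_P
```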
See [34] for an alternative treatment of the multiple hypothesis, multiple model tracking problem.

Summary and Realization Aspects: A Bayesian tracking algorithm is an iterative updating scheme for conditional probability density functions p(x_k | Z^l) representing all available knowledge on the kinematic state vectors x_k of the objects to be tracked at discrete instants of time t_k. The pdfs are conditioned both by the sensor data Z^l accumulated up to some time t_l, typically the current scan time, and by the underlying a priori models (sensor characteristics, object dynamics). If the state vectors x_k are required for times other than t_l, the related estimation process is referred to as prediction (t_k > t_l) or filtering (t_k = t_l). In the following, the iterative calculation is illustrated schematically:
prediction: p(x_{k-1} | Z^{k-1}) → p(x_k | Z^{k-1}); filtering: p(x_k | Z^{k-1}) → p(x_k | Z^k).

The densities are typically mixtures of individual densities, each of which corresponds to a particular data interpretation or dynamics model hypothesis assumed to be true. This structure is a direct consequence of the uncertain origin of the sensor data and of the uncertainty underlying the object dynamics. Provided the densities p(x_k | Z^k) are calculated correctly, optimal state estimates can be derived from them. Extensions of this Bayesian scheme to extended objects and object clusters and to retrodiction exist. Due to the uncertain origin of the sensor data, naively applied Bayesian tracking leads to mixture densities whose number of components grows exponentially. The number of components that are actually relevant in the densities resulting from prediction and filtering is, however, finite; it may be fluctuating and even large for a while, but it does not grow without bounds [1, 32]. In other words, the densities can often be approximated by mixtures with (far) fewer components (pruning of irrelevant mixture components and merging of similar ones). Provided the relevant features of the densities are preserved, the resulting suboptimal algorithms are expected to be close to optimal Bayesian filtering. For dealing with non-linearities, extended or unscented Kalman filtering (EKF [2], UKF [14]) or particle filtering (PF [31]) can be used.
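The pruning of irrelevant mixture components mentioned above can be sketched in a few lines; the weight threshold and the maximum number of retained components are hypothetical tuning parameters:

```python
import numpy as np

def prune_mixture(weights, means, covs, keep=10, min_weight=1e-4):
    """Suboptimal approximation: discard mixture components with negligible
    weight, keep at most `keep` of the remaining ones, and renormalize."""
    w = np.asarray(weights)
    idx = [i for i in np.argsort(w)[::-1] if w[i] >= min_weight][:keep]
    w_kept = w[idx] / w[idx].sum()
    return w_kept, [means[i] for i in idx], [covs[i] for i in idx]
```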
3 Example. Resource management for a multifunctional radar certainly depends on the particular mission; here we discuss track maintenance for ground-based surveillance while minimizing the allocation time and energy required. The track accuracy is important only insofar as stable tracks are guaranteed; track initiation is not considered. To make the benefits of IMM modeling and amplitude information clearly visible, false detections (clutter, electronic countermeasures) and data association conflicts are excluded. Nevertheless, their impact might well be incorporated into the general Bayesian framework [16].

In phased-array radar tracking, additional sensor information can be acquired when needed ('radar allocation') [7], and certain radar parameters can be chosen for each allocation: the revisit time t_k, the current beam position b_k, i.e. a unit vector pointing into the direction where the radar energy is transmitted, and the transmitted energy per dwell e_k. Other radar parameters relevant for the skin echo produced by the illuminated object (detection threshold \lambda_D, radar beam width B) are assumed to be constant. After a successful allocation, the radar provides measurements of the direction cosines of the object and of the object range, z_k,
along with the signal amplitude a_k. A single dwell may be insufficient for object detection and subsequent fine localization. Let n_B denote the number of dwells needed for a successful detection and B_k = {b_k^i}_{i=1}^{n_B} the set of the corresponding beam positions. Each radar allocation is thus characterized by the tuple R_k = (t_k, B_k, n_B, e_k, z_k, a_k); the series of successive allocations is denoted by R^k = {R_j}_{j=1}^{k}.

1) Radar Cross Section Fluctuations: The instantaneous radar cross section \sigma_k of realistic objects fluctuates, and statistical models are used for describing the backscattering properties of the objects. In many practical cases, the fluctuations are well described by gamma densities,

$$p(\sigma_k \mid \bar\sigma, m) = \frac{m}{\bar\sigma\,\Gamma(m)} \left(\frac{m\, \sigma_k}{\bar\sigma}\right)^{m-1} e^{-m \sigma_k/\bar\sigma}.$$

In this equation, \bar\sigma denotes the mean RCS of the object, which is usually unknown but constant in time and characteristic of a certain class of objects, while the parameter m denotes the number of 'degrees of freedom'. The individual samples \sigma_k are assumed to be statistically independent
. The cases m = 1, 2 are referred to as Swerling-I and -III fluctuations [11]. Let s_k = (s_1, s_2) denote the two components of the instantaneous object signal, modeled according to the standard assumptions in [11]. Since the signal components are assumed to be statistically independent, the pdf of the resulting signal is given by

$$p(s_k \mid v_k) = \mathcal{N}(s_1;\, v_1,\, \sigma_n)\, \mathcal{N}(s_2;\, v_2,\, \sigma_n). \tag{51}$$

The normalized scalar quantity a_k^2 = (s_1^2 + s_2^2)/(2\sigma_n^2), derived from s_k, is thus Rice-distributed,

$$p(a_k^2 \mid sn_k) = e^{-(a_k^2 + sn_k)}\, I_0\!\left(2 a_k \sqrt{sn_k}\right), \qquad sn_k = \frac{v_1^2 + v_2^2}{2\sigma_n^2},$$

with the modified Bessel function I_0, the instantaneous signal-to-noise ratio sn_k of the object being proportional to the instantaneous RCS \sigma_k and therefore gamma-distributed. The pdf of a_k^2 given the mean signal-to-noise ratio SN is obtained from

$$p(a_k^2 \mid SN) = \int d\,sn_k\; p(a_k^2 \mid sn_k)\, p(sn_k \mid SN).$$

The integration can be carried out analytically (see [1], e.g.), yielding

$$p(a_k^2 \mid SN, m) = \left(\frac{m}{m + SN}\right)^{m} \exp\!\left(-\frac{m\, a_k^2}{m + SN}\right) L_{m-1}\!\left(-\frac{SN\, a_k^2}{m + SN}\right),$$

where L_{m-1} denotes the Laguerre polynomials; for Swerling-I/III these are given by L_0(-x) = 1 and L_1(-x) = 1 + x. Obviously, p(a_k^2 | SN) can be interpreted as a gamma mixture with the expectation value E[a_k^2] = 1 + SN.
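A quick Monte Carlo sanity check of this amplitude model (with \sigma_n normalized to one and purely hypothetical parameter values) reproduces the stated expectation value E[a_k^2] = 1 + SN:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_amplitude_sq(SN, m, n=200_000):
    """Monte Carlo sketch of the signal model above (sigma_n normalized to 1):
    the instantaneous SNR sn_k is gamma distributed with mean SN and shape m,
    the two signal components are independent Gaussians, and
    a_k^2 = (s_1^2 + s_2^2) / 2."""
    sn = rng.gamma(shape=m, scale=SN / m, size=n)   # instantaneous SNR, mean SN
    s1 = rng.normal(np.sqrt(2.0 * sn), 1.0)         # in-phase component
    s2 = rng.normal(0.0, 1.0, size=n)               # quadrature component
    return (s1**2 + s2**2) / 2.0

for m in (1, 2):                                    # Swerling-I and -III
    a2 = sample_amplitude_sq(SN=10.0, m=m)
    print(m, a2.mean())                             # both close to 1 + SN = 11
```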
2) Mean Received Signal Strength: At the revisit time t_k, the received signal strength is determined by the chosen radar parameters and the relevant object parameters (mean RCS, object position). With a Gaussian beam form, which has proven useful in applications, and the radar range equation (see [11], e.g.), we assume

$$SN_k = SN_0\; \frac{\bar\sigma}{\sigma_0} \left(\frac{r_k}{r_0}\right)^{-4} \frac{e_k}{e_0}\; e^{-2\,(\Delta b_k/B)^2}, \tag{53}$$

where r_k is the actual object range at time t_k, while \Delta b_k denotes the angular distance between the beam position b_k and the related object direction; with the (one-sided) beam width B, \Delta b_k/B is a measure of the relative beam positioning error. The radar parameter SN_0 is the expected mean signal-to-noise ratio of an object with a standard mean cross section \sigma_0 at a reference range r_0 that is directly (\Delta b_k = 0) illuminated by a beam with the energy e_0. Due to the functional relationship stated in Equation 53, the signal strength carries information on the mean RCS, the object range, and the relative beam position.
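A small helper evaluating this relation; the reference values below (SN_0, \sigma_0, r_0, e_0, B) are hypothetical illustration numbers, and the functional form is the reconstruction of Equation 53 given above:

```python
import numpy as np

def expected_snr(mean_rcs, r, energy, delta_b,
                 SN0=50.0, sigma0=1.0, r0=80e3, e0=1.0, B=0.02):
    """Mean received signal-to-noise ratio according to the beam-form /
    radar-range relation of Equation 53 (reference values are hypothetical)."""
    return (SN0 * (mean_rcs / sigma0) * (r / r0) ** -4
            * (energy / e0) * np.exp(-2.0 * (delta_b / B) ** 2))

# Directly illuminated object of standard RCS at the reference range:
print(expected_snr(1.0, 80e3, 1.0, 0.0))   # equals SN0
```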
3) Detection and Measurement Process: Given the amplitude density (Equation 50), the detection probability P_D is a function of SN and the detection threshold \lambda_D,

$$P_D(SN, \lambda_D, m) = \int_{\lambda_D}^{\infty} d a_k^2\; p(a_k^2 \mid SN, m).$$

The false alarm probability P_F is analogously obtained: P_F(\lambda_D) = P_D(0, \lambda_D, m) = e^{-\lambda_D}. The integration results in explicit expressions for P_D [11]. For Swerling-I/III fluctuations we obtain

$$P_D(SN, \lambda_D) = e^{-\lambda_D/(1+SN)} = P_F^{\,1/(1+SN)} \qquad (m = 1),$$
$$P_D(SN, \lambda_D) = e^{-2\lambda_D/(2+SN)} \left(1 + \frac{2\, SN\, \lambda_D}{(2+SN)^2}\right) \qquad (m = 2).$$

For object tracking, the amplitude a_k^2 is available only after a detection occurred, i.e. for a_k^2 > \lambda_D. We thus need the conditional density p(a_k^2 | a_k^2 > \lambda_D, SN, m), which is proportional to e^{-m a_k^2/(m+SN)}\, L_{m-1}\!\left(-SN\, a_k^2/(m+SN)\right) for a_k^2 > \lambda_D, i.e. similar in structure to the unconditioned expression. On the other hand, the detection probability for m > 1 can approximately be described by a Swerling-I model, P_D(SN, \lambda_D, m) \approx P_D(SN, \lambda_D). We can therefore write p(a_k^2 \mid a_k^2 > \lambda_D, SN, m) \approx S_m(a_k^2;\, SN, m).
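The detection probability expressions above can be checked numerically; note that the Swerling-III case is the result of integrating the reconstructed amplitude density, so treat it as a sketch rather than the source's exact expression:

```python
import numpy as np

def detection_prob(SN, lam_D, m=1):
    """Detection probability for the threshold test a_k^2 > lam_D;
    m = 1 (Swerling-I) and m = 2 (Swerling-III) only."""
    if m == 1:
        return np.exp(-lam_D / (1.0 + SN))
    if m == 2:
        c = 2.0 / (2.0 + SN)
        return np.exp(-c * lam_D) * (1.0 + 2.0 * SN * lam_D / (2.0 + SN) ** 2)
    raise ValueError("only m = 1, 2 implemented")

P_F = 1e-4                    # false alarm probability
lam_D = -np.log(P_F)          # detection threshold, P_F = exp(-lam_D)
# For Swerling-I the two expressions coincide:
print(detection_prob(10.0, lam_D, m=1), P_F ** (1.0 / 11.0))
```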
Let us furthermore assume that the angle measurements produced after a detection are bias-free measurements u_k of the object direction with Gaussian errors. According to [11], the corresponding standard deviations depend on the beam width B and the instantaneous signal-to-noise ratio sn_k in the following way:

$$\sigma_k^{u} \;\propto\; B/\sqrt{sn_k} \;\approx\; B/\sqrt{a_k^2};$$

in the last approximation, a_k^2 is used as an estimate of sn_k (E[a_k^2 | sn_k] = 1 + sn_k). The range measurement error is assumed to be Gaussian with a constant standard deviation. This model of the measurement process does not depend on the RCS fluctuation model.

4) Tracking Algorithms Revisited: According to the previous discussion, object tracking is an iterative updating scheme for conditional probability densities p(x_k | R^k) that describe the current object state x_k, given all radar allocations R^k and the underlying a priori statistical models. The processing of each new measurement via Bayes' rule establishes a recursive relation between the densities at two consecutive revisit times (a prediction step followed by filtering):

$$p(x_k \mid R^{k-1}) = \sum_{j^k} p_{k|k-1}^{j^k}\, \mathcal{N}(x_k;\, x_{k|k-1}^{j^k},\, P_{k|k-1}^{j^k}),$$

with j^k denoting a particular model history, i.e. a sequence of hypotheses regarding the object dynamics model from a certain observation at time t_{k-n+1} up to the most recent measurement at time t_k ('n scans back').
In the case of a single dynamics model (r = 1), the prediction densities p(x_k | R^{k-1}) are strictly given by Gaussians (standard Kalman prediction). For several dynamics models, GPB2 and standard IMM are approximate realizations of this scheme [3]. For standard IMM, the approximations are applied after the prediction, but before the filtering step, while for GPB2 they are applied after the filtering step.

2) Processing of Signal Strength Information:
Let us treat the normalized RCS of the object, s_k = \sigma_k/\sigma_0, as an additional component of the state vector. Since the signal strength a_k received after a detection occurred may be viewed as a measurement of s_k, let us consider the augmented conditional density p(x_k, s_k | R^k) = p(s_k | x_k, R^k)\, p(x_k | R^k). The calculation of p(x_k | R^k) was discussed in Section 2. For the remaining density p(s_k | x_k, R^k), an application of Bayes' rule yields, up to a normalizing constant,

$$p(s_k \mid x_k, a_k, R^{k-1}) \;\propto\; S_m(a_k;\, SN_k, m)\; p(s_k \mid x_k, R^{k-1}). \tag{63}$$

The information elements required for producing a timely situation picture are produced by fusion of the pieces of information available, which in themselves often have only limited value for understanding the situation. Essential within the fusion process is the complementarity of the contributing sources. Concretely speaking, the methodology is characterized by a stochastic approach (estimating relevant state quantities) and by exploiting context information. Besides the operational requirements, this more or less coherent methodology is the second building principle which gives the field of sensor data and information fusion its identity.

C Overview of a Generic Tracking System: Among the fusion products, so-called 'tracks' are of particular importance. Tracks represent knowledge on relevant state quantities of individual objects, object groups and formations, or even large object aggregations (e.g.
marching formations). Estimates of the kinematic state of the objects and, if available, knowledge describing the classes or identities of the objects/object groups are important constituents of tracks. The achievable track quality, however, does not only depend on the sensor performance, but also on the operational conditions within the actually considered scenario and on the available context knowledge.

[Figure: Generic scheme of a tracking/fusion system and its sensor systems (sensing hardware, signal processing, track initiation, track processing).]

The figure shows the functional blocks of such a system together with its relation to the sensors (a centralized configuration). Essentially working as a means of data rate reduction, the signal processing provides
estimates of parameters characterizing the waveforms received at the sensors' front ends (e.g. radar echoes), possibly related to objects of interest, which are the input for the tracking/fusion system. Sensor data that can be associated to existing tracks are used for track maintenance (using, e.g., prediction, filtering, and retrodiction). The remaining data are processed for initiating new tentative tracks (multiple frame track extraction). Association of sensor data to track maintenance and initiation, track confirmation/termination, object classification/identification, and fusion of tracks related to the same objects are part of the track processing. The scheme is completed by displaying and interaction functions.
Context information can be updated or modified by human interaction or by the track processor itself, for example by object classification or road map extraction. In the case of multifunctional sensors, feedback from the track processor to the sensor system is possible (sensor management).

Modern multifunctional agile-beam radar based on phased-array technology is an excellent example of a sensor that requires such management. This is particularly true for multiple object tracking tasks, where such systems call for algorithms that efficiently exploit their degrees of freedom, which are variable over a wide range and may be chosen individually for each track. Of special interest are military air situations where both highly maneuverable objects and objects significantly differing in their radar cross section must be taken into account. Unless properly handled, such situations can be highly demanding. By combined tracking and sensor management, i.e. by adapting the innovation intervals, the radar beam positioning, and the transmitted energy while efficiently exploiting the limited radar resources, the total surveillance performance can be improved. This is the task of phased-array radar management. The starting point is the tracking system, which generates requests for new sensor information based on the current quality of each individual object track or on the requirement of initiating new tracks. We thus distinguish between track update and track search requests, which enter into the priority management, thus enabling graceful degradation in overload situations. For each radar allocation, certain parameters must be specified, such as the calculated radar revisit time and the corresponding radar beam position, the range and Doppler gates, or the type of the radar wave forms to be transmitted. Track search requests require the setting of a finite revisit interval after which the radar beam is allocated and the received echo signals are tested for a detection. If no detection occurs in the track maintenance mode, a local search procedure is initiated, new radar parameters are set, and a new allocation is started with as small a time delay as possible. This local search loop is repeated until either a valid detection is produced or the track is
canceled, while a new beam position according to a global or sector search pattern is calculated if no detection occurred. In the track search mode, a tentative detection has to be confirmed before a new track is finally established.

[Figure: Phased-array radar management loop: the tracking system issues track update and track search requests, which pass through priority management, dwell scheduling, and, if necessary, a local search.]

After a successful detection, the received echo signals are processed and estimates of the object range, azimuth angle, radial velocity, and the object strength are produced, which form the input of the tracking system. Allocation time and energy savings are thus to be expected if adaptive models of the object dynamics (including high-g maneuvers) are used. Besides their kinematic characteristics, the mean radar cross section (RCS) of the objects to be tracked is usually unknown and variable over a wide range. By processing of signal amplitude information, however, the energy spent for track maintenance can be adapted to the actual object strength.
By these means, the total energy required for track maintenance can be reduced. Due to the locally confined object illumination by the pencil-beam of a phased-array radar, track maintenance is critical since, in contrast to a track-while-scan radar, a periodic object illumination is not guaranteed, and the radar resources are also needed for other tasks (e.g. weapon guidance or providing communications links). This calls for intelligent algorithms for beam positioning and local search [17], [24], [20] that are crucial to phased-array radar tracking. For track-while-scan radar systems, Bayesian tracking techniques provide an iterative updating scheme for conditional probability densities of the object state,
given all sensor data and a priori information available. In phased-array applications, however, the received signal-to-noise ratio of the object (i.e. the detection probability and the measurement accuracy) depends on the correct positioning of the pencil-beam, which is taken into the responsibility of the tracking system. Sensor control and data processing are thus closely interrelated. This basically local character of the tracking process constitutes the principal difference between phased-array and track-while-scan tracking. The potential of the Bayesian approach is thus also available for phased-array radar. The more difficult problem of global optimization, taking all radar allocations into account, is not addressed here.

Sensor and Dynamics Models in Bayesian Object Tracking: The Bayesian approach is a well-established methodology for dealing with uncertainty. More concretely speaking, the Bayesian approach provides a processing scheme for dealing with uncertain information (of a particular type), which also allows 'delayed' decisions if a decision cannot be made reliably in a particular data situation. Ambiguities can have different causes. The sensors can produce ambiguous data due to their limited resolution capabilities or due to Doppler blindness (MTI: Moving Target Indicator).
Often the objects' environment is a source of ambiguities itself (dense object situations, residual clutter, man-made noise, unwanted objects). A more indirect type of ambiguity arises from the objects' behavior (e.g. qualitatively distinct maneuvering phases). Finally, the context knowledge to be exploited can imply problem-inherent ambiguities as well, such as intersections in road maps or ambiguous tactical rules describing the over-all object behavior. The general multiple-object, multiple-sensor tracking task is highly complex and therefore beyond the scope of this introduction. Nevertheless, in many applications the tracking task can be decomposed into sub-problems of (much) less complexity. According to this discussion, we proceed along the following lines.

Basis: In the course of time, one or several sensors produce measurements of objects of interest. The accumulated sensor data are an example of a time series. The object state consists of quantities such as the object position, its velocity, and its acceleration.

Objective: Learn as much as possible about the individual object states at each time of interest by analyzing the 'time series' created by the sensor data.

Problem: The sensor information is inaccurate, incomplete, and possibly even ambiguous.

Approach: Interpret sensor measurements and object state vectors as random variables. Describe by probability density functions (pdfs) what is known about these random variables.

Solution: Derive iteration formulae for calculating the probability density functions of the state variables and develop a mechanism for initiating the iteration. Derive state estimates from the pdfs along with appropriate quality measures.
The sensor data are produced at discrete instants of time, denoted by t_k. Let us consider the set Z_k = {z_k^j}_{j=1}^{n_k} of n_k measurements collected at time t_k; in the case of multiple objects, x_k is the joint state. The corresponding time series up to and including t_k is recursively defined by Z^k = {Z_k, n_k, Z^{k-1}}. The central question of object tracking is: What can be known about the object states x_l at the time instants t_l, i.e. for the past (t_l < t_k), the present (t_l = t_k), and the future (t_l > t_k), by exploiting the sensor data collected up to time t_k, i.e. the time series Z^k? According to the reasoning previously sketched, the answer is given by the conditional probability density functions (pdfs) p(x_l | Z^k), to be calculated iteratively as a consequence of Bayes' rule.
For filtering, i.e. for the object states at the current time t_k, we obtain:

$$p(x_k \mid Z^k) = \frac{p(Z_k, n_k \mid x_k)\, p(x_k \mid Z^{k-1})}{\int dx_k\; p(Z_k, n_k \mid x_k)\, p(x_k \mid Z^{k-1})}. \tag{1}$$

In other words, p(x_k | Z^k) can be calculated from the pdfs p(x_k | Z^{k-1}) and p(Z_k, n_k | x_k). Obviously, p(Z_k, n_k | x_k) needs to be known up to a constant factor only; any function

$$\ell(x_k;\, Z_k, n_k) \;\propto\; p(Z_k, n_k \mid x_k)$$

yields the same result. Functions of this type are also called likelihood functions and describe what can be learned from the current sensor output Z_k, n_k about the object state x_k at this time. This is the reason why likelihood functions are often also called 'sensor models', since they mathematically represent the sensor measurements and the sensor properties. In the case of well-separated objects, perfect detection, absence of false returns, and bias-free measurements of linear functions H_k x_k of the object state with white measurement noise characterized by a covariance matrix R_k, the likelihood functions are proportional to a Gaussian: \ell(x_k; z_k, H_k, R_k) = N(z_k; H_k x_k, R_k).

B Prediction Update: The density p(x_k | Z^{k-1}) entering the filtering update at time t_k according to Equation 1 is a prediction of the knowledge on the object state for time t_k, given the measurements received up to and including the previous time t_{k-1}. Here the object dynamics comes into play, yielding

$$p(x_k \mid Z^{k-1}) = \int dx_{k-1}\; p(x_k \mid x_{k-1}, Z^{k-1})\, p(x_{k-1} \mid Z^{k-1}). \tag{3}$$

The state transition density p(x_k | x_{k-1}, Z^{k-1}) is often called the 'object dynamics model' and mathematically represents the kinematic object properties, in the same way as the likelihood function represents the sensor.

1) Gauss-Markov Dynamics: A Gauss-Markov dynamics is defined by the transition density

$$p(x_k \mid x_{k-1}, Z^{k-1}) = \mathcal{N}(x_k;\, F_{k|k-1} x_{k-1},\, D_{k|k-1}),$$
with the evolution matrix F_{k|k-1} describing the deterministic part of the temporal evolution and the dynamics covariance matrix D_{k|k-1} characterizing its stochastic part. If we additionally assume that the previous posterior is a Gaussian,

$$p(x_{k-1} \mid Z^{k-1}) = \mathcal{N}(x_{k-1};\, x_{k-1|k-1},\, P_{k-1|k-1}),$$

then p(x_k | Z^{k-1}) is also a Gaussian,

$$p(x_k \mid Z^{k-1}) = \mathcal{N}(x_k;\, x_{k|k-1},\, P_{k|k-1}),$$

with an expectation vector x_{k|k-1} = F_{k|k-1} x_{k-1|k-1} and a covariance matrix P_{k|k-1} given by

$$P_{k|k-1} = F_{k|k-1} P_{k-1|k-1} F_{k|k-1}^\top + D_{k|k-1}.$$

This directly results from a useful product formula for Gaussians,

$$\mathcal{N}(z;\, Hx,\, R)\, \mathcal{N}(x;\, y,\, P) = \mathcal{N}(z;\, Hy,\, S)\; \mathcal{N}(x;\, y + W\nu,\, P - W S W^\top) = \mathcal{N}(z;\, Hy,\, S)\; \mathcal{N}\!\left(x;\, Q(P^{-1} y + H^\top R^{-1} z),\, Q\right), \tag{9}$$

with

$$\nu = z - Hy, \quad S = H P H^\top + R, \quad W = P H^\top S^{-1}, \quad Q = \bigl(P^{-1} + H^\top R^{-1} H\bigr)^{-1}. \tag{10}$$

Note that after applying this formula, the integration variable x_{k-1} in Equation 3 appears in only one factor; the integration is thus trivial.

2) IMM Dynamics Model: In practical applications, the object might show qualitatively distinct maneuvering phases, which can be characterized by different dynamics modes. This situation can be handled by using several dynamics models with a given probability of switching between them (IMM: Interacting Multiple Models, [2], [6] and the literature cited therein). The model transition probabilities are thus part of the modeling assumptions. More strictly speaking, suppose that r models are given and let j_k denote the dynamics model assumed to be in effect at time t_k. The statistical properties of systems with Markovian switching coefficients are summarized by the following equation:

$$p(x_k, j_k \mid x_{k-1}, j_{k-1}) = p(x_k \mid x_{k-1}, j_k)\, p(j_k \mid j_{k-1}).$$

Sketch of a proof of the product formula (9): Consider N(z; Hx, R) N(x; y, P) as a joint density p(z, x) = p(z | x) p(x).
It can be written as a Gaussian in the joint variable (z, x), from which the marginal and the conditional densities p(z) and p(x | z) can be derived; in doing so, one makes use of known formulae for the inverse of a partitioned matrix (see [2, p. 22], e.g.). From p(z, x) = p(x | z) p(z) the formula results.
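A numerical spot check of the product formula (9) at a single point, with small hypothetical matrices; this is a sanity sketch, not a proof:

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(1)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
P = np.array([[2.0, 0.3], [0.3, 1.0]])
y = np.array([1.0, -1.0])
x = rng.normal(size=2)     # arbitrary evaluation points
z = rng.normal(size=1)

S = H @ P @ H.T + R
W = P @ H.T @ np.linalg.inv(S)
nu = z - H @ y

lhs = mvn.pdf(z, H @ x, R) * mvn.pdf(x, y, P)
rhs = mvn.pdf(z, H @ y, S) * mvn.pdf(x, y + W @ nu, P - W @ S @ W.T)
print(lhs, rhs)            # agree up to floating point error
```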
With Gauss-Markov dynamics for each individual model, the transition density factorizes as

$$p(x_k, j_k \mid x_{k-1}, j_{k-1}) = p(j_k \mid j_{k-1})\, \mathcal{N}(x_k;\, F_{k|k-1}^{j_k} x_{k-1},\, D_{k|k-1}^{j_k}).$$

The tracking performance does not seem to depend critically on the particular choice of the transition probabilities p(j_k | j_{k-1}), provided the number r of models involved is small. Let us assume that the previous posterior is written as a Gaussian mixture,

$$p(x_{k-1} \mid Z^{k-1}) = \sum_{j^{k-1}} p(x_{k-1}, j^{k-1} \mid Z^{k-1}), \tag{14}$$

i.e. as a weighted sum of individual Gaussians. The vector index j^{k-1} is defined by j^{k-1} = (j_{k-1}, \ldots, j_{k-n}), i.e. the mixture p(x_{k-1}
| Z^{k-1}) has r^n components, where n is a parameter; the case n = 1 corresponds to the standard situation. Given a previous posterior of this type, we obtain for the prediction update:

$$p(x_k \mid Z^{k-1}) = \sum_{j_k} \sum_{j^{k-1}} \int dx_{k-1}\; p(x_k, j_k \mid x_{k-1}, j_{k-1})\, p(x_{k-1}, j^{k-1} \mid Z^{k-1}) \tag{16}$$

$$= \sum_{j_k} \sum_{j^{k-1}} p_{k|k-1}^{j^k}\, \mathcal{N}(x_k;\, x_{k|k-1}^{j^k},\, P_{k|k-1}^{j^k}),$$

with weighting factors p_{k|k-1}^{j^k}, expectation vectors x_{k|k-1}^{j^k}, and covariance matrices P_{k|k-1}^{j^k} given by

$$p_{k|k-1}^{j^k} = p(j_k \mid j_{k-1})\, p(j^{k-1} \mid Z^{k-1}), \quad x_{k|k-1}^{j^k} = F_{k|k-1}^{j_k} x_{k-1|k-1}^{j^{k-1}}, \quad P_{k|k-1}^{j^k} = F_{k|k-1}^{j_k} P_{k-1|k-1}^{j^{k-1}} F_{k|k-1}^{j_k\,\top} + D_{k|k-1}^{j_k}.$$
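A sketch of this prediction update for Markov-switching dynamics models: every pair (previous model, current model) contributes one Gaussian component, with the weights, means, and covariances defined above (all inputs hypothetical):

```python
import numpy as np

def imm_prediction(model_probs, means, covs, F_list, D_list, switch):
    """Prediction update for r Markov-switching dynamics models (Equation 16).
    `switch[i, j]` is the transition probability from model i to model j;
    the output mixture has one component per (previous, current) model pair."""
    w_out, m_out, P_out = [], [], []
    for i, (w_i, x_i, P_i) in enumerate(zip(model_probs, means, covs)):
        for j, (F, D) in enumerate(zip(F_list, D_list)):
            w_out.append(switch[i, j] * w_i)        # p(j_k|j_{k-1}) p(j^{k-1}|Z^{k-1})
            m_out.append(F @ x_i)                   # F^{j_k} x^{j^{k-1}}
            P_out.append(F @ P_i @ F.T + D)         # F P F^T + D
    return np.array(w_out), m_out, P_out
```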
Via moment matching [2, p. 56], the number of mixture components can be kept constant by approximating

$$\sum_{j^{k-1}} p_{k|k-1}^{j^k}\, \mathcal{N}(x_k;\, x_{k|k-1}^{j^k},\, P_{k|k-1}^{j^k}) \;\approx\; p(j_k \mid Z^{k-1})\, \mathcal{N}(x_k;\, x_{k|k-1}^{j_k},\, P_{k|k-1}^{j_k}),$$

with p(j_k | Z^{k-1}), x_{k|k-1}^{j_k}, and P_{k|k-1}^{j_k} given by

$$p(j_k \mid Z^{k-1}) = \sum_{j^{k-1}} p_{k|k-1}^{j^k}, \qquad x_{k|k-1}^{j_k} = \frac{1}{p(j_k \mid Z^{k-1})} \sum_{j^{k-1}} p_{k|k-1}^{j^k}\, x_{k|k-1}^{j^k},$$
$$P_{k|k-1}^{j_k} = \frac{1}{p(j_k \mid Z^{k-1})} \sum_{j^{k-1}} p_{k|k-1}^{j^k} \left( P_{k|k-1}^{j^k} + \bigl(x_{k|k-1}^{j^k} - x_{k|k-1}^{j_k}\bigr)\bigl(x_{k|k-1}^{j^k} - x_{k|k-1}^{j_k}\bigr)^\top \right),$$

yielding a Gaussian mixture representation of p(x_k | Z^{k-1}) with r mixture components.
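The second-order moment matching above, i.e. replacing a mixture by a single Gaussian with the same mean and covariance, can be written compactly (a sketch with hypothetical inputs):

```python
import numpy as np

def moment_match(weights, means, covs):
    """Replace a Gaussian mixture by a single Gaussian with the same
    first and second moments, as in the merging formulae above."""
    w = np.asarray(weights) / np.sum(weights)
    means = [np.asarray(m) for m in means]
    x_bar = sum(wi * mi for wi, mi in zip(w, means))
    P_bar = sum(wi * (Pi + np.outer(mi - x_bar, mi - x_bar))
                for wi, mi, Pi in zip(w, means, covs))
    return x_bar, P_bar
```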
C Filtering Update Step: According to the previous considerations, the conditional pdf p(x_k | Z^k) is obtained by combining the following pieces of evidence: p(x_{k-1} | Z^{k-1}) (the accumulated knowledge of the past), p(x_k | x_{k-1}) (object dynamics), and \ell(x_k; Z_k, n_k) (the current measurements, i.e. the sensor model).

1) Kalman Update Formulae: In the case of well-separated objects under ideal conditions, i.e. without false returns, assuming perfect detection, a single dynamics model, and Gaussian measurement errors, the well-known Kalman filter results as a limiting case of the Bayesian approach. The Kalman filter is thus a simple, straightforward realization of Bayesian tracking. In this idealized situation, i.e. with

$$p(x_k \mid x_{k-1}) = \mathcal{N}(x_k;\, F_{k|k-1} x_{k-1},\, D_{k|k-1}), \qquad \ell(x_k;\, z_k) = \mathcal{N}(z_k;\, H_k x_k,\, R_k),$$

the knowledge at each time t_k is represented by a single Gaussian, p(x_k | Z^k) = N(x_k; x_{k|k}, P_{k|k}). According to the product formula (Equation 9), we obtain

$$x_{k|k} = x_{k|k-1} + W_{k|k-1}\,(z_k - H_k x_{k|k-1}) = P_{k|k}\bigl(P_{k|k-1}^{-1} x_{k|k-1} + H_k^\top R_k^{-1} z_k\bigr),$$
$$P_{k|k} = P_{k|k-1} - W_{k|k-1} S_{k|k-1} W_{k|k-1}^\top = \bigl(P_{k|k-1}^{-1} + H_k^\top R_k^{-1} H_k\bigr)^{-1}.$$
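The two equivalent forms of the Kalman update quoted above (gain form and information form) can be verified against each other numerically; the matrices below are hypothetical illustration values:

```python
import numpy as np

P_pred = np.array([[2.0, 0.4], [0.4, 1.0]])
x_pred = np.array([1.0, 0.5])
H = np.array([[1.0, 0.0]])
R = np.array([[0.3]])
z = np.array([1.4])

# Gain form
S = H @ P_pred @ H.T + R
W = P_pred @ H.T @ np.linalg.inv(S)
x_gain = x_pred + W @ (z - H @ x_pred)
P_gain = P_pred - W @ S @ W.T

# Information form: P_filt = (P_pred^-1 + H^T R^-1 H)^-1,
#                   x_filt = P_filt (P_pred^-1 x_pred + H^T R^-1 z)
P_info = np.linalg.inv(np.linalg.inv(P_pred) + H.T @ np.linalg.inv(R) @ H)
x_info = P_info @ (np.linalg.inv(P_pred) @ x_pred + H.T @ np.linalg.inv(R) @ z)

print(np.allclose(x_gain, x_info), np.allclose(P_gain, P_info))  # True True
```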