Simulator for image and video data processing pipeline
Offer details
Airbus Defence & Space

9 October 2018 | Aerospace | Airbus Defence & Space

Type of contract

6-month internship

Specialisation

Aerospace

Company presentation

The Space System business line of Airbus Defence & Space is the European leader in the field of optical Earth Observation systems. Throughout its history, the company has been a pioneer of the space industry, responsible for developing the first Earth Observation space systems in Europe, starting with the SPOT family. Since then, the company has led the major European developments in the field, through programs such as METOP, ERS, ENVISAT, HELIOS, PLEIADES or SPOT6. This experience is now applied to export turn-key programs such as FORMOSAT, THEOS, ALSAT, CHILI, KazEOSat-1 or PeruSat, involving systems with up to sub-metric resolution, as well as COMS, a geostationary meteorological satellite for Korea. This evolution has led Airbus Defence & Space to develop strong expertise in Image Quality, Image Processing and Image Simulation through a group of about 80 engineers in 2017, constituting the Image Chain department (TESUI). The Image team carries out activities in fundamental…

Missions

The Sensor Processing Chain department designs and develops image processing chains for Earth observation satellites. The volume of data produced by new-generation satellites is increasing dramatically. To deal with this amount of data, our processing chains are based on the latest Cloud and Big Data technologies, such as Apache Spark, Flink and Kafka. Within this context, we recently designed a set of tools that simulate data processing pipelines, for both image processing and video processing. They provide metrics about the performance (throughput, latency) and resource consumption of the processing chains. These tools are developed in Java and Python, and rely on the same big data frameworks as the processing chains. This internship is about improving these simulation tools, in particular:
- adding new operators to be able to express more complex pipelines,
- improving the data metrics and visualization tools,
- integrating these tools into our product line, to improve our ability to maintain our processing chains.
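
As a rough illustration of what such a simulator measures, here is a minimal, self-contained Python sketch. It is not the actual tool's API: the Operator and Pipeline names and the simulated per-operator costs are hypothetical. It simply chains a few operators over data blocks and reports throughput and mean latency.

    # Hypothetical sketch of a pipeline simulator: operators are chained
    # functions, and the simulator reports throughput and latency per block.
    import time
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Operator:
        name: str
        fn: Callable[[bytes], bytes]   # transforms one data block
        cost_s: float = 0.0            # simulated processing time per block

    class Pipeline:
        def __init__(self, operators: List[Operator]):
            self.operators = operators

        def simulate(self, blocks: List[bytes]) -> dict:
            latencies = []
            start = time.perf_counter()
            for block in blocks:
                t0 = time.perf_counter()
                for op in self.operators:
                    time.sleep(op.cost_s)   # stand-in for the real operator cost
                    block = op.fn(block)
                latencies.append(time.perf_counter() - t0)
            elapsed = time.perf_counter() - start
            return {
                "throughput_blocks_per_s": len(blocks) / elapsed,
                "mean_latency_s": sum(latencies) / len(latencies),
            }

    if __name__ == "__main__":
        pipeline = Pipeline([
            Operator("decode", lambda b: b, cost_s=0.001),
            Operator("radiometric_correction", lambda b: b, cost_s=0.002),
        ])
        print(pipeline.simulate([b"\x00" * 1024 for _ in range(100)]))

In the actual tools, the operators would presumably correspond to stages of the Spark, Flink or Kafka processing chains, and the measured costs would come from the frameworks themselves rather than from a fixed simulated delay.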

Desired profile

- Linux operating systems (intermediate/advanced)
- Java and Python programming (intermediate/advanced)
- Big data frameworks, functional programming (intermediate)
- Autonomous and willing to communicate with several interlocutors to gather information
- Interested in massive data processing
- Cloud computing

Appreciated knowledge, but not mandatory:
- C++ programming
- Modelling frameworks and tools such as Eclipse Modeling Framework and JetBrains MPS

Desired education:
- Engineering school or Master in software development
- Engineering school or Master, with a specialisation in signal and image processing, or applied mathematics

Start date

04/02/2019

Remuneration

-

Work location

TOULOUSE (31400)

Apply

This offer is no longer open to applications.
