
Requirements Engineering Management Training (FAA/EASA/DOD/NASA)

Scope and Purpose of the Requirements Engineering Management Training

This two-and-a-half-day training presents a set of recommended practices on how to collect, write, validate, and organize requirements. It brings together the best ideas from several approaches, organizes them into a coherent whole, and illustrates them with concrete examples that make their benefits clear. The presentations draw on the FAA Requirements Engineering Management Handbook (DOT/FAA/AR-08/32) but are not limited to that handbook.


The literature on requirements engineering is vast, and practices vary widely even within a particular industry. For this reason, the Handbook is targeted to the domain of real-time, embedded systems and specifically to the avionics industry. Due to the rapidly growing importance of software in these systems, it emphasizes practices that ease the transition from system to software requirements.

The main-level recommended practices are presented roughly in the order they would be performed on a program, but there is no requirement that this order must be strictly adhered to.

As with most processes, significant iteration between the different activities is expected as the requirements are refined. Rather than trying to specify a detailed process, the Handbook focuses on identifying what information is needed in a requirements specification and providing recommendations on how that information can be collected and organized.

About the FAA Requirements Engineering Management Handbook

Requirements Engineering Management Handbook, DOT/FAA/AR-08/32, June 2009. This document is available to the U.S. public through the National Technical Information Service (NTIS), Springfield, Virginia 22161. U.S. Department of Transportation Federal Aviation Administration. This report is available at the Federal Aviation Administration William J. Hughes Technical Center’s Full-Text Technical Reports page: actlibrary.tc.faa.gov in Adobe Acrobat portable document format (PDF).

The FAA Handbook is targeted to the domain of real-time, embedded systems and specifically to the avionics industry. It describes a set of recommended practices in which basic concepts can be practiced in isolation, but reinforce each other when practiced as a whole. These practices allow developers to progress from an initial, high-level overview of a system to a detailed description of its behavioral and performance requirements. Due to the growing importance of software in avionics systems, these practices emphasize techniques to ease the transition from system to software requirements.

Concrete examples are used throughout the Handbook to make the concepts clear, but there are many other formats that could be used to obtain the same objectives. It is expected that most organizations wanting to use these practices will want to modify them, perhaps significantly, to integrate them with their existing processes and tools.
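As a concrete illustration of the condition/assigned-value style of requirement the Handbook recommends, the sketch below is loosely based on the Isolette thermostat example (Appendix A). The variable names and temperature thresholds are hypothetical, chosen only to show the pattern of assigning each controlled variable a value under an explicit condition on a monitored variable:

```python
# Hedged sketch of the Handbook's monitored/controlled-variable style,
# loosely based on the Isolette thermostat example (Appendix A).
# All names and thresholds below are hypothetical, for illustration only.

LOWER_DESIRED_TEMP = 36.0  # hypothetical lower limit of desired range (deg C)
UPPER_DESIRED_TEMP = 37.5  # hypothetical upper limit of desired range (deg C)

def heat_control(current_temp: float) -> str:
    """Requirement expressed as condition -> assigned value:
    the controlled variable (heat_control) is assigned a value under an
    explicit condition on the monitored variable (current_temp)."""
    if current_temp < LOWER_DESIRED_TEMP:
        return "ON"    # condition: temperature below desired range
    elif current_temp > UPPER_DESIRED_TEMP:
        return "OFF"   # condition: temperature above desired range
    else:
        return "HOLD"  # hypothetical: maintain state within desired range
```

Writing each detailed requirement in this form makes completeness and consistency checks mechanical: every condition on the monitored variables should map to exactly one assigned value of the controlled variable.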

The training follows the same approach: it is targeted to the domain of real-time, embedded systems, specifically the avionics industry, and walks through the Handbook's recommended practices, emphasizing techniques that ease the transition from system to software requirements.

Training Contents

The training contents are synchronized with the contents of the FAA Requirements Engineering Management Handbook, as follows:

1. INTRODUCTION
1.1 Purpose
1.2 Background

2. RECOMMENDED PRACTICES

2.1 Developing the System Overview
2.1.1 Develop System Overview Early
2.1.2 Provide System Synopsis
2.1.3 Identify System Contexts
2.1.4 Use Context Diagrams
2.1.5 Describe External Entities
2.1.6 Capture Preliminary System Goals
2.1.7 Maintain System Goal Information

2.2 Identify the System Boundary
2.2.1 Identify the System Boundary Early
2.2.2 Choose Environmental Variables
2.2.3 Choose Controlled Variables
2.2.4 Choose Monitored Variables
2.2.5 Ensure Environmental Variables are Sufficiently Abstract
2.2.6 Avoid Presentation Details in Environmental Variables
2.2.7 Define All Physical Interfaces

2.3 Develop the Operational Concepts
2.3.1 Document Sunny Day System Behavior
2.3.2 Include How the System is Used in its Operating Environment
2.3.3 Employ the Use Case Goal as its Title
2.3.4 Trace Each Use Case to System Goals
2.3.5 Identify Primary Actor, Preconditions, and Postconditions
2.3.6 Ensure Each Use Case Describes a Dialogue
2.3.7 Link Use Case Steps to System Functions
2.3.8 Consolidate Repeated Actions Into a Single Use Case
2.3.9 Describe Exceptional Situations as Exception Cases
2.3.10 Describe Alternate Ways to Satisfy Postconditions as Alternate Courses
2.3.11 Use Names of External Entities or Environmental Variables
2.3.12 Avoid Operator Interface Details
2.3.13 Update the System Boundary
2.3.14 Assemble a Preliminary Set of System Functions

2.4 Identify the Environmental Assumptions
2.4.1 Define the Type, Range, Precision, and Units
2.4.2 Provide Rationale for the Assumptions
2.4.3 Organize Assumptions Constraining a Single Entity
2.4.4 Organize Assumptions Constraining Several Entities
2.4.5 Define a Status Attribute for Each Monitored Variable
2.4.6 Summary

2.5 Develop the Functional Architecture
2.5.1 Organize System Functions Into Related Groups
2.5.2 Use Data Flow Diagrams to Depict System Functions
2.5.3 Minimize Dependencies Between Functions
2.5.4 Define Internal Variables
2.5.5 Nest Functions and Data Dependencies for Large Specifications
2.5.6 Provide High-Level Requirements That are Really High Level
2.5.7 Do Not Incorporate Rationale Into the Requirements

2.6 Revise the Architecture to Meet Implementation Constraints
2.6.1 Modify the Architecture to Meet Implementation Constraints
2.6.2 Keep Final System Architecture Close to Ideal Functional Architecture
2.6.3 Revise the System Overview
2.6.4 Revise the Operational Concepts
2.6.5 Develop Exception Cases
2.6.6 Link Exception Cases to Use Cases
2.6.7 Revise the System Boundary
2.6.8 Document Changes to Environmental Assumptions
2.6.9 Revise Dependency Diagrams
2.6.10 Revise High-Level Requirements

2.7 Identify the System Modes
2.7.1 Identify Major System Modes
2.7.2 Define How System Transitions Between Modes
2.7.3 Introduce Modes for Externally Visible Discontinuities

2.8 Develop the Detailed Behavior and Performance Requirements
2.8.1 Specify the Behavior of Each Controlled Variable
2.8.2 Specify the Requirement as a Condition and an Assigned Value
2.8.3 Ensure That Detailed Requirements are Complete
2.8.4 Ensure That Detailed Requirements are Consistent
2.8.5 Ensure That Detailed Requirements are not Duplicated
2.8.6 Organize the Requirements
2.8.7 Define Acceptable Latency for Each Controlled Variable
2.8.8 Define Acceptable Tolerance for Each Controlled Variable
2.8.9 Do Not Define Latency and Tolerance for Internal Variables
2.8.10 Alternative Ways to Specify Requirements

2.9 Define the Software Requirements
2.9.1 Specify the Input Variables
2.9.2 Specify the Accuracy of Each Input Variable
2.9.3 Specify the Latency of Each Input Variable
2.9.4 Specify IN’ for Each Monitored Variable
2.9.5 Specify the Status of Each Monitored Variable
2.9.6 Flag Design Decisions as Derived Requirements
2.9.7 Specify the Output Variables
2.9.8 Specify the Latency of Each Output Variable
2.9.9 Specify the Accuracy of Each Output Variable
2.9.10 Specify OUT’ for Each Controlled Variable
2.9.11 Confirm Overall Latency and Accuracy

2.10 Allocate System Requirements to Subsystems
2.10.1 Identify Subsystem Functions
2.10.2 Duplicate Overlapping System to Subsystem Functions
2.10.3 Develop a System Overview for Each Subsystem
2.10.4 Identify the Subsystem Monitored and Controlled Variables
2.10.5 Create New Monitored and Controlled Variables
2.10.6 Specify the Subsystem Operational Concepts
2.10.7 Identify Subsystem Environmental Assumptions Shared With Parent System
2.10.8 Identify Environmental Assumptions of the New Monitored and Controlled Variables
2.10.9 Complete the Subsystem Requirements Specification
2.10.10 Ensure Latencies and Tolerances are Consistent

2.11 Provide Rationale
2.11.1 Provide Rationale to Explain why a Requirement Exists
2.11.2 Avoid Specifying Requirements in the Rationale
2.11.3 Provide Rationale When the Reason a Requirement is not Obvious
2.11.4 Provide Rationale for Environmental Assumptions
2.11.5 Provide Rationale for Values and Ranges
2.11.6 Keep Rationale Short and Relevant
2.11.7 Capture Rationale as Soon as Possible

3. SUMMARY

4. REFERENCES

APPENDICES
A—Isolette Thermostat Example
B—Flight Control System Example
C—Flight Guidance System Example
D—Autopilot Example
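The practices in Section 2.4 above (define the type, range, precision, and units of each environmental variable, with rationale) can be sketched as a simple data record. The class and the example values below are hypothetical, intended only to show what attributes such an assumption might carry:

```python
# Hedged sketch of an environmental-assumption record per Section 2.4
# (type, range, precision, units, rationale). Names and values are
# hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class EnvironmentalAssumption:
    """One monitored or controlled variable with the attributes the
    Handbook recommends documenting."""
    name: str         # variable name
    var_type: str     # "monitored" or "controlled"
    low: float        # lower bound of the physical range
    high: float       # upper bound of the physical range
    precision: float  # smallest distinguishable change
    units: str        # engineering units
    rationale: str    # why these bounds were chosen

current_temp = EnvironmentalAssumption(
    name="Current_Temperature",
    var_type="monitored",
    low=20.0,
    high=60.0,         # hypothetical sensor range
    precision=0.1,     # hypothetical sensor resolution
    units="deg C",
    rationale="Range covers the temperature sensor's operating envelope.",
)
```

Keeping assumptions in a uniform record like this makes it straightforward to review them for completeness and to check that detailed requirements stay within the documented ranges.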

Training Schedule and Execution Method
  • Online training via Zoom, led by a live, U.S.-based instructor (Dr. Ismail Cicek)
  • 2.5 days of training
    • 1st Day: 09:00 – 17:00 (Lunch Break between 12:30 and 13:30)
    • 2nd Day: 09:00 – 17:00 (Lunch Break between 12:30 and 13:30)
    • 3rd Day: 09:00 – 13:00
    • Time zone: Central European Time (CET)
  • Registration includes all presentations and additional material shared before the class.
  • Attendees will receive a Training Certificate.
  • Training includes knowledge-check quizzes, a fun, competition-style way of learning.

GDS Systems Engineering V&V Training Courses
Event Calendar

We announce upcoming training on these pages. Due to the COVID-19 pandemic, we currently offer ONLINE training courses only. Please contact us if you need group training, which can be scheduled around your plans.

Select the training from the list below that best fits your needs.

Upcoming Events


About the Instructors

The training is provided by Dr. Ismail Cicek and an avionics chief engineer who is also a Certified Verification Engineer (FAA/EASA). The training is also assisted by our personnel experienced in MIL-STD-810H testing.

The co-instructor is a Certified Verification Engineer (CVE) in accordance with FAA/EASA, with 18 years of experience. He has worked as avionics systems chief engineer in avionics product development and is experienced in product testing per environmental and EMI/EMC standards and in FAA/EASA certification processes.

Our experienced personnel, who actively participate in the environmental testing of products, also support our training programs.

Dr. Ismail Cicek earned his PhD in the Mechanical Engineering Department at Texas Tech University in Texas, USA; his research included random vibration. He has over 30 years of combined industrial and academic experience.

He gained engineering and leadership experience over 15 years as a systems development engineer on United States Department of Defense projects and programs. He led the development of various engineering systems for platforms including the C-5, C-17, KC-10, KC-135, and C-130 E/H/J. Dr. Cicek's experience includes unmanned aerial vehicle development, where he utilized Geographical Information Systems (GIS), and development of the Malfunction Detection, Analysis, and Recording System (MADARS) for military transport aircraft.

Dr. Cicek worked for five years as the lab chief engineer at the US Air Force Aeromedical Test Lab at WPAFB, OH. He received several important awards in the positions he served, recognizing excellent teamwork and his detail-oriented, energetic approach. These included Terra Health's Superior Client Award in 2009 and an Engineering Excellence Award in 2010, as well as an appreciation letter from the US Air Force Aeronautical Systems Center (ASC), signed by the commander in charge.

Dr. Cicek also established a test lab, the Marine Equipment Test Center (METC), located at Istanbul Technical University's Tuzla Campus, for testing equipment per military and civilian standards such as RTCA-DO-160. Providing engineering, consultancy, and training services to many companies and organizations, Dr. Cicek has gained great insight into tailoring standard test methods in accordance with military standards, guides, and handbooks, as well as the Life Cycle Environmental Profile (LCEP) developed for the equipment under test.

Dr. Cicek has also completed various product and research projects funded in the USA, the EU, and Turkey. He currently teaches at the Istanbul Technical University Maritime Faculty, Tuzla/Istanbul, and is the founding manager of the METC at ITU's Tuzla Campus. He has also provided engineering services, consultancy, and training to many organizations for product development and engineering research, including algorithm development, test requirements development, and test planning and execution.

Dr. Cicek worked as the Principal Investigator and became a Subject Matter Expert (SME) at the US Air Force Aeromedical Test Lab (WPAFB, OH) for certifying products to US Air Force platform requirements. He also developed the Joint Enroute Care Equipment Test Standard (JECETS) in close cooperation with US Army test lab engineers and managers.

Read DAU Paper: “A New Process for the Acceleration Test and Evaluation of Aeromedical Equipment for U.S. Air Force Safe-To-Fly Certification”. Click to display this report.

Visit the following pages for more details about Dr Ismail Cicek:
Linkedin Page

Click here to read more about our training courses.

GLOBAL DYNAMIC SYSTEMS (GDS)
TRAINING COURSES
Worldwide, Online, for ‘Groups’ or ‘Individuals’

Training on
MIL-STD-810H
ENVIRONMENTAL TESTING

Training on
EMI/EMC Testing
(per RTCA-DO-160 & MIL-STD-461)

Training on
Systems Engineering
(DoD/FAA/NASA/EASA)

Training on
Vibration and Shock
Testing

Training on
RTCA-DO-160G
ENVIRONMENTAL TESTING

Training on
MIL-STD-461G EMI/EMC Testing
(incl. MIL-STD-464)

Training on
Requirements Management
(FAA/EASA/US DoD/NASA)

Training on
MIL-STD-704F
Aircraft Electrical Interface


OUR REFERENCES

We have provided training courses to more than 100 companies and organizations
and over 500 individual trainees so far.

Our references include ARMERKOM, the Turkish Navy (DzKK), Raytheon in Dallas, TX (Product Verification and Validation PhD course provided in 2009 by Dr. Ismail Cicek as part of the Texas Tech & Raytheon PhD study on Systems Engineering), the USAF Aeronautical Systems Center (MIL-STD-810 acceleration test consultancy), and TUBITAK BILGEM.
GDS Engineering R&D has been providing systems engineering training courses (such as MIL-STD-810 and RTCA-DO-160) since 2009!