Characteristics of an IT Engineer

  1. Passionate; loves computers and programming, takes an interest and thinks about things even outside working hours.
  2. Curious; wants to understand new things, researches unfamiliar terms.
  3. Humble; recognizes that other people are smart and have great ideas and knowledge, respects relationships more than technology.
  4. Creative; sees ways to do things that others don’t see, comes up with better ways of doing things, goes beyond.
  5. Friendly; easy to get along with, does not sabotage or bring down team morale.
  6. Fast learner; can quickly research, understand and use unfamiliar software technologies, tools and languages.
  7. Focus; works towards completion of tasks with minimal distraction, avoids taking tangents.
  8. Comprehension; can make sense of software requirements and understand what it is that needs to be built, able to grasp the “mental model” of the internal structure of a software application.
  9. Logic skills; ability to devise logical solutions for programming problems.
  10. Pragmatic; able to make a value judgment about what is really important, values practical outcomes and getting the job done, avoids gold plating.
  11. Not dogmatic; willing to change their mind and see things from the perspective of someone else, values the intellect of others. Not a jerk.
  12. Workmanlike; willing to do the drudge work as well as the exciting work.
  13. Thorough; puts in the 10% more needed to do a great job rather than an adequate job.
  14. Intellect; able to grasp very complex computing concepts, able to develop very sophisticated code, able to do “the hard stuff”.
  15. Energy; productive, motivated, strong work ethic, gets a lot of work done in the available working time.
  16. Practices; writes lots of code, especially in the early years.
  17. Persistence; sticks at it, takes the time needed to get something done or to learn something new.
  18. Flexible; adaptable, happy to take new directions, happy to work with new technologies, happy to try new things, happy to change priorities.
  19. Thirst for knowledge; actively self-educates, reads and researches, willing to learn from others, believes there is always much more to learn.
  20. Expert knowledge; has superb knowledge of, and has thoroughly researched, the primary programming languages (typically three or fewer), object models and frameworks that they do most of their day-to-day programming with.
  21. Deep knowledge; has an in-depth understanding of, and experience in, a small number (typically fewer than 10) of programming languages and related technologies.
  22. Broad knowledge; has passing familiarity with a very wide range of programming languages and related computer technologies.
  23. Ability to write; can string words together to communicate: client emails, co-worker emails, documentation, proposals, blog posts, tweets.
  24. Knowledge of computer science fundamentals; object-oriented programming, design patterns, algorithms and data structures, how computers work at a low level, hardware, operating systems, networking, databases and much more.
  25. Verbal communication; able to explain their own thought process, can explain complex concepts, can participate in discussions with team members, can communicate with customers/users and other non-technical people.
  26. User oriented; can empathise with users, understands where the users are coming from and what is most important to them.
  27. Software design and architecture; can design class structures, can design APIs, can design subsystems within an application, or can design entire application architectures.
  28. Quality oriented; understands software testing, writes tests for their code where appropriate, understands the concept of test driven development, meets organisational expectations for testing & quality, feels satisfied by a job well done.
  29. Balances coding priorities; knows when code should be written primarily for robustness, maintainability, reusability, speed of development, execution performance, scalability, security, polish, presentation, usability or some other factor.
  30. Problem solving; knows how to attack a problem and has the tenacity to solve even very hard problems, uses appropriate debugging tools.
  31. Development tools; understands their development tools, compiler/IDE and knows how to get the most out of them.
  32. Seeks simplicity; understands the danger in complexity, prefers simple solutions.
  33. Interested in the field; knowledge of the industry, trends, directions and history.
  34. Avoids re-inventing the wheel; able to look at a problem, analyse it, work out what class of problems it comes from, can find patterns, libraries, algorithms, data structures or other pre-existing solutions that might fit the problem well and reduce the need to write code.
  35. Honest; can admit mistakes, unafraid to admit they don’t know something.
  36. Detail oriented; pays close attention. Avoids missing things, not sloppy or half-baked.
  37. Understands the lifecycle of software development; the roles played by developers and other people in that process.
  38. Manages own workload; able to prioritise their own tasks, willing to adapt to change.
  39. Cares about maintainability.
  40. Uses source control.
  41. Appreciates peer review; does not feel threatened or insulted by peer feedback.
  42. Groks; is able to read source code and learn what it is doing.
  43. Understands performance; able to optimise and write fast code when appropriate, knows how to avoid common performance problems.
  44. Writes clean code; readable, well formatted, appropriately commented code.
  45. Understands requirements specifications; able to make sense of software requirements, knows how to resolve questions and ambiguities, understands the relationship between requirements and testing.
  46. Follows coding standards; where there is such an expectation.
  47. Wants to be working on this project, at this company; a programmer is unlikely to do a great job if they are working on a project they don’t enjoy, or working at a company they don’t like.
  48. Strong research skills; good at ferreting out information: digging through documentation, searching the web, reading reference guides, release notes, discussion forums, mailing lists. Knows how to find answers.

 

Reference: http://www.supercoders.com.au/blog/50characteristicsofagreatsoftwaredeveloper.shtml

Characteristics of Good Software

Software Quality Characteristics

While developing any kind of software product, the first question in any developer’s mind is, “What qualities should good software have?” Before going into the technical characteristics, I would like to state the obvious expectations one has of any software. First and foremost, a software product must meet all the requirements of the customer or end-user. Also, the cost of developing and maintaining the software should be low, and development should be completed within the specified time frame.

These are the obvious expectations from any project (and software development is a project in itself). Now let’s take a look at the software quality factors. This set of factors can be explained by the Software Quality Triangle. The three characteristics of good application software are:
1)  Operational Characteristics
2)  Transition Characteristics
3)  Revision Characteristics

Software Quality Triangle

[Figure: Software Quality Triangle with characteristics]

The 16 characteristics of good software are listed below.

What Operational Characteristics should software have?

These are functionality-based factors related to the ‘exterior quality’ of the software. The various operational characteristics of software are:

a) Correctness: The software should meet all the specifications stated by the customer.
b) Usability/Learnability: The amount of effort or time required to learn how to use the software should be low. This makes the software user-friendly even for people with little IT experience.
c) Integrity: Just as medicines can have side effects, software may have side effects, i.e. it may affect the working of another application. Quality software should not have such side effects.
d) Reliability: The software product should not have any defects, and it should not fail during execution.
e) Efficiency: This characteristic relates to the way the software uses the available resources. The software should make effective use of storage space and execute commands within the desired timing requirements.
f) Security: With the increase in security threats nowadays, this factor is gaining importance. The software should not have ill effects on data or hardware, and proper measures should be taken to keep data secure from external threats.
g) Safety: The software should not be hazardous to the environment or to life.

What are the Revision Characteristics of software?

These engineering-based factors relate to the ‘interior quality’ of the software, such as efficiency, documentation and structure. They should be built into any good software. The various revision characteristics of software are:

a) Maintainability: Maintenance of the software should be easy for any kind of user.
b) Flexibility: Changes in the software should be easy to make.
c) Extensibility: It should be easy to increase the functions performed by it.
d) Scalability: It should be very easy to upgrade it for more work(or for more number of users).
e) Testability: Testing the software should be easy.
f) Modularity: Software is said to be made of units and modules which are independent of each other. These modules are then integrated to make the final software. If the software is divided into separate, independent parts that can be modified and tested separately, it has high modularity.
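
To make modularity concrete, here is a minimal Python sketch (all module and function names are hypothetical): each part has a single responsibility and no dependency on the other, so each can be modified and tested separately before being integrated into the final program.

```python
# A minimal sketch of modularity; the names are hypothetical.
# Each "module" is independent of the others and could live in its own file.

# tax.py -- knows only how to compute tax
def compute_tax(amount: float, rate: float = 0.10) -> float:
    """Return the tax owed on an amount at the given rate."""
    return round(amount * rate, 2)

# formatting.py -- knows only how to present a value
def format_currency(value: float, symbol: str = "$") -> str:
    """Render a numeric value as a currency string."""
    return f"{symbol}{value:,.2f}"

# invoice.py -- integrates the independent modules into the final behaviour
def invoice_line(amount: float) -> str:
    total = amount + compute_tax(amount)
    return format_currency(total)

if __name__ == "__main__":
    print(invoice_line(100.0))  # $110.00
```

Because the two lower-level pieces do not depend on each other, either one can be changed or replaced without touching the other, which is exactly the property high modularity is meant to give.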

What are the Transition Characteristics of software?

a) Interoperability: Interoperability is the ability of software to exchange information with other applications and make use of information transparently.
b) Reusability: If the software code can be used, with some modification, for a different purpose, the software is said to be reusable.
c) Portability: The ability of the software to perform the same functions across different environments and platforms demonstrates its portability.

The importance of each of these factors varies from application to application. In systems where human life is at stake, integrity and reliability must be given prime importance. In any business-related application, usability and maintainability are key factors to consider. Always remember that in software engineering the quality of the software is everything, so try to deliver a product that has all of these characteristics and qualities.
Reference: http://www.ianswer4u.com/2011/10/characteristics-of-good-software.html

Software testing

Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test.[1] Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation.

Test techniques include, but are not limited to, the process of executing a program or application with the intent of finding software bugs.

1. Defects and failures

Not all software defects are caused by coding errors. One common source of expensive defects is requirement gaps, e.g., unrecognized requirements that result in errors of omission by the program designer.[6] A common source of requirements gaps is non-functional requirements such as testability, scalability, maintainability, usability, performance, and security.

 

2. Testing methods

2.1 Static and dynamic testing

There are many approaches to software testing. Reviews, walkthroughs, or inspections are referred to as static testing, whereas actually executing programmed code with a given set of test cases is referred to as dynamic testing.

2.2 Black-box and white-box testing

White-box testing (also known as clear box testing, glass box testing, transparent box testing, and structural testing) tests the internal structures or workings of a program.

Black-box testing treats the software as a “black box”, examining functionality without any knowledge of internal implementation. The tester is only aware of what the software is supposed to do, not how it does it.
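
The difference is easiest to see side by side. In the sketch below (the function and its tests are hypothetical), the black-box tests are written purely from the specification, while the white-box tests are written with knowledge of the code and aim to cover its internal branches.

```python
# Hypothetical function under test, with two internal branch points.
import unittest

def classify_age(age: int) -> str:
    if age < 0:
        raise ValueError("age cannot be negative")
    return "minor" if age < 18 else "adult"

class BlackBoxTests(unittest.TestCase):
    """Written only from the specification: inputs in, expected outputs out."""
    def test_adult(self):
        self.assertEqual(classify_age(30), "adult")
    def test_minor(self):
        self.assertEqual(classify_age(10), "minor")

class WhiteBoxTests(unittest.TestCase):
    """Written with knowledge of the code: cover the boundary at 18
    and the error-handling branch."""
    def test_boundary_is_adult(self):
        self.assertEqual(classify_age(18), "adult")
    def test_negative_raises(self):
        with self.assertRaises(ValueError):
            classify_age(-1)

if __name__ == "__main__":
    unittest.main()
```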


3. Testing levels

3.1 Unit testing

Unit testing, also known as component testing, refers to tests that verify the functionality of a specific section of code, usually at the function level.
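
As a minimal illustration (the function and its collaborator are hypothetical), a unit test exercises one piece of code in isolation, typically replacing its collaborators with stubs or mocks so that only the unit itself is being verified.

```python
# A minimal unit-test sketch; the names are hypothetical.
import unittest
from unittest.mock import Mock

def apply_discount(price: float, discount_service) -> float:
    """Unit under test: depends on a collaborator that supplies the rate."""
    rate = discount_service.rate_for(price)
    return round(price * (1 - rate), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_ten_percent_discount(self):
        stub = Mock()
        stub.rate_for.return_value = 0.10  # the real service is not involved
        self.assertEqual(apply_discount(200.0, stub), 180.0)

if __name__ == "__main__":
    unittest.main()
```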

3.2 Integration testing

Integration testing is any type of software testing that seeks to verify the interfaces between components against a software design.
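
In contrast to the unit-test sketch above, an integration test wires real components together and checks the interface between them. A minimal sketch with hypothetical class names:

```python
# A minimal integration-test sketch; the classes are hypothetical.
import unittest

class InMemoryUserRepository:
    def __init__(self):
        self._users = {}
    def save(self, user_id: str, name: str) -> None:
        self._users[user_id] = name
    def find(self, user_id: str):
        return self._users.get(user_id)

class UserService:
    def __init__(self, repository):
        self.repository = repository
    def register(self, user_id: str, name: str) -> None:
        self.repository.save(user_id, name)
    def greeting(self, user_id: str) -> str:
        name = self.repository.find(user_id)
        return f"Hello, {name}!" if name else "Hello, guest!"

class UserServiceIntegrationTest(unittest.TestCase):
    def test_service_and_repository_work_together(self):
        service = UserService(InMemoryUserRepository())  # real parts, no mocks
        service.register("u1", "Ada")
        self.assertEqual(service.greeting("u1"), "Hello, Ada!")

if __name__ == "__main__":
    unittest.main()
```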

3.3 System testing

System testing tests a completely integrated system to verify that it meets its requirements.

4. Testing approach

4.1 Top-down and bottom-up

In the top-down approach, testing starts with the highest-level modules and works downward, using stubs in place of lower-level components that are not yet integrated; in the bottom-up approach, testing starts with the lowest-level modules and works upward, using drivers to exercise them.

5. Objectives of testing

5.1 Installation testing

5.2 Compatibility testing

5.3 Smoke and sanity testing

Sanity testing determines whether it is reasonable to proceed with further testing.

Smoke testing is used to determine whether there are serious problems with a piece of software, for example as a build verification test.
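
A smoke test might look something like the sketch below (the base URL and endpoints are assumptions, not part of any real product): it only checks that the most critical paths respond at all, so the team can decide whether the build is worth testing more deeply.

```python
# A minimal smoke-test / build-verification sketch; URL and paths are hypothetical.
import sys
import urllib.request

def smoke_test(base_url: str) -> bool:
    """Return True if the critical endpoints respond; no deep functional checks."""
    critical_paths = ["/health", "/login"]
    for path in critical_paths:
        try:
            with urllib.request.urlopen(base_url + path, timeout=5) as resp:
                print(f"OK   {path}: HTTP {resp.status}")
        except Exception as exc:  # connection refused, timeout, HTTP error, ...
            print(f"FAIL {path}: {exc}")
            return False
    return True

if __name__ == "__main__":
    sys.exit(0 if smoke_test("http://localhost:8080") else 1)
```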

5.4 Regression testing

5.5 Alpha testing

5.6 Beta testing

A sample testing cycle

Although variations exist between organizations, there is a typical cycle for testing. The sample below is common among organizations employing the Waterfall development model.

  • Requirements analysis: Testing should begin in the requirements phase of the software development life cycle. During the design phase, testers work with developers in determining what aspects of a design are testable and with what parameters those tests work.
  • Test planning: Test strategy, test plan and testbed creation. Since many activities will be carried out during testing, a plan is needed.
  • Test development: Test procedures, test scenarios, test cases, test datasets and test scripts to use in testing the software.
  • Test execution: Testers execute the software based on the plans and test documents, then report any errors found to the development team.
  • Test reporting: Once testing is completed, testers generate metrics and make final reports on their test effort and whether or not the software tested is ready for release.
  • Test result analysis: Or defect analysis; done by the development team, usually along with the client, in order to decide which defects should be assigned, fixed, rejected (i.e. the software is found to be working properly) or deferred to be dealt with later.
  • Defect retesting: Once a defect has been dealt with by the development team, it is retested by the testing team. Also known as resolution testing.
  • Regression testing: It is common to have a small test program built from a subset of tests, for each integration of new, modified, or fixed software, in order to ensure that the latest delivery has not broken anything and that the software product as a whole is still working correctly (a minimal sketch of such a suite follows this list).
  • Test Closure: Once the test meets the exit criteria, the activities such as capturing the key outputs, lessons learned, results, logs, documents related to the project are archived and used as a reference for future projects.
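
To make the regression step above concrete, a regression run is typically a fixed suite built from a subset of existing tests and re-run on every build. A minimal sketch using Python's unittest (the guarded function and test names are hypothetical):

```python
# A minimal regression-suite sketch; the function and tests are hypothetical.
import unittest

def discount(price: float, rate: float) -> float:
    """Previously shipped behaviour that every new build must preserve."""
    return round(price * (1 - rate), 2)

class CoreDiscountTests(unittest.TestCase):
    """Tests that guarded the last release; kept in the regression subset."""
    def test_standard_rate(self):
        self.assertEqual(discount(100.0, 0.2), 80.0)
    def test_zero_rate(self):
        self.assertEqual(discount(50.0, 0.0), 50.0)

def build_regression_suite() -> unittest.TestSuite:
    loader = unittest.TestLoader()
    suite = unittest.TestSuite()
    suite.addTests(loader.loadTestsFromTestCase(CoreDiscountTests))
    return suite

if __name__ == "__main__":
    result = unittest.TextTestRunner(verbosity=2).run(build_regression_suite())
    raise SystemExit(0 if result.wasSuccessful() else 1)
```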

Reference: http://en.wikipedia.org/wiki/Software_testing

Defect Priority

 

Defect Priority (Bug Priority) indicates the importance or urgency of fixing a defect. Though priority may be initially set by the Software Tester, it is usually finalized by the Project/Product Manager.

Priority can be categorized into the following levels:

  • Urgent: Must be fixed in the next build.
  • High: Must be fixed in any of the upcoming builds but should be included in the release.
  • Medium: May be fixed after the release / in the next release.
  • Low: May or may not be fixed at all.

Priority is also denoted as P1 for Urgent, P2 for High and so on.
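
As a small illustrative sketch (one possible representation, not a standard), the priority levels above could be modelled in a defect-tracking script like this:

```python
# Hypothetical representation of the priority levels described above.
from enum import IntEnum

class Priority(IntEnum):
    URGENT = 1  # P1: must be fixed in the next build
    HIGH = 2    # P2: must be fixed before the release
    MEDIUM = 3  # P3: may be fixed after the release / in the next release
    LOW = 4     # P4: may or may not be fixed at all

def label(priority: Priority) -> str:
    """Render a priority as its conventional short form, e.g. P1."""
    return f"P{priority.value}"

if __name__ == "__main__":
    print(label(Priority.URGENT), Priority.URGENT.name)  # P1 URGENT
```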

NOTE: Priority is quite a subjective decision; do not take the categorizations above as authoritative. However, at a high level, priority is determined by considering the following:

  • Business need for fixing the defect
  • Severity/Impact
  • Probability/Visibility
  • Available Resources (Developers to fix and Testers to verify the fixes)
  • Available Time (Time for fixing, verifying the fixes and performing regression tests after the verification of the fixes)

ISTQB Definition:

  • priority: The level of (business) importance assigned to an item, e.g. defect.

Defect Priority needs to be managed carefully in order to avoid product instability, especially when there is a large number of defects.

Defect Severity

Defect Severity, or Impact, is a classification of a software defect (bug) to indicate the degree of negative impact on the quality of the software.

ISTQB Definition

  • severity: The degree of impact that a defect has on the development or operation of a component or system.

DEFECT SEVERITY CLASSIFICATION

The actual terminology and its meaning can vary depending on people, projects, organizations, or defect tracking tools, but the following is a commonly accepted classification.

  • Critical: The defect affects critical functionality or critical data. It does not have a workaround. Example: Unsuccessful installation, complete failure of a feature.
  • Major: The defect affects major functionality or major data. It has a workaround, but the workaround is not obvious and is difficult. Example: A feature is not functional from one module, but the task is doable if 10 complicated indirect steps are followed in another module.
  • Minor: The defect affects minor functionality or non-critical data. It has an easy workaround. Example: A minor feature that is not functional in one module but the same task is easily doable from another module.
  • Trivial: The defect does not affect functionality or data. It does not even need a workaround. It does not impact productivity or efficiency. It is merely an inconvenience. Example: Petty layout discrepancies, spelling/grammatical errors.
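
As with priority, severity can be captured as a simple field on a defect record. A minimal illustrative sketch (the classes and the example defect are hypothetical):

```python
# Hypothetical defect record combining the severity classification above
# with the priority discussed in the previous section.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    CRITICAL = "critical"  # no workaround; critical functionality/data affected
    MAJOR = "major"        # workaround exists but is difficult and not obvious
    MINOR = "minor"        # easy workaround exists
    TRIVIAL = "trivial"    # cosmetic only; no workaround needed

@dataclass
class Defect:
    summary: str
    severity: Severity   # set by the tester based on impact
    priority: str        # e.g. "P1".."P4", finalized by the project/product manager

if __name__ == "__main__":
    bug = Defect("Installer crashes on first run", Severity.CRITICAL, "P1")
    print(bug)
```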

Reference : http://softwaretestingfundamentals.com/defect-severity/