- Johnson (Bolin, ed.), "A False Sense of Proprietary,"
The Standards Edge: Dynamic Tension, Chapter 17, Sheridan Books,
2004 (pdf)
Organizations often resist standards in order
to protect their proprietary information and processes. Protecting
proprietary information is essential. However, enterprises often have a
misplaced idea of what information is truly proprietary to their business. The
protection of information has cost associated with it. While it is important to
protect intellectual property, it is also important not to protect that which
is part of the broad state of the art. Furthermore, some information increases
dramatically in value when shared in a collaborative environment. This article
explores the concept of a misplaced sense of what is proprietary, the business
case for separating proprietary and nonproprietary information, and the value
of standardizing the latter.
- Johnson, "Minimizing Systems
Integration Expenditure in a Changing World," iViP-RoaM Workshop, Torino
Italy, December 2002. (pdf)
Keynote Presentation delivered to the iViP-RoaM
Workshop. Addresses the application of standards and OMG-MDA to preserve
investment in business process applications in a changing execution
infrastructure
- Blakely, Johnson, Koko, Amador, and Fairfull,
"Integrating CAE and PDM A First Step Towards Providing
Simulation Data Management," Volume 2, Number 2, Spring 2001. (pdf)
[Excerpt from article's introduction]
Simulation is the act of predicting product performance
prior to a product's manufacture. Most often, analysts, those doing CAE
(Computer-Aided Engineering), perform simulation, and in too many cases only as
an adjunct, not as an integral part of the product development
process. Analysts often consider CAE outside the process because
of the difficulty and time involved in obtaining accurate inputs.
PDM (Product Data Management) would seem to be a natural
complement to CAE, by providing a repository of data that is accessed by CAE.
Historically, though, that has not been the case. The CAE community generally
still views incorporation of PDM as an added burden, with benefits yet unknown.
Despite the issues inherent in integrating CAE and PDM (a
subset of Simulation Data Management), the benefits include enabling
the access, verification, and control of CAE input and output data. Simulation
Data Management is the next frontier in making simulation such an integral part
of the product development process that it will lead and dictate the overall
design (form follows function).
- Johnson, "Aspects of
Interoperability OMG, STEP & XML as Examples", D. H.
Brown Conference, October 1999. (pdf,
ppt)
The presentation addresses aspects of interoperability
technologies such as OMG, STEP and XML using the vehicle of ISO's RM-ODP
(Reference Model of Open Distributed Processing).
- Johnson, "Interoperability
Examples from MSC's Architectural Directions", D. H.
Brown Conference, November 1998. (pdf,
ppt)
A concept of "Architectural Aspects"
is discussed with a short treatment on interoperability for various aspects. A
"Blend and Build" approach to evolving a systems architecture is
introduced that enables the migration of legacy systems over time.
- Johnson, "Beyond 'Form &
Fit', the Management of 'Function' ", D. H. Brown Conference,
November 1998. (pdf,
ppt)
Traditional Product Information Management (PIM) focuses on
"Form" and "Fit". Vast amounts of data are generated and
consumed in the course of analyzing products. These data allow us to manage the
"Function" of the product as well as its "Form and Fit".
Effective management of this data can allow analysis to precede detailed
design, minimizing the false steps of designers and managing trade-off studies.
- Johnson and Lawson, "We Don't Need
No Stinkin' PDM", D. H. Brown Product Information Management
Symposium Proceedings, October, 1996. (pdf,
doc)
This paper contains two different perspectives that arrive
at the same conclusion. "PDM" is an outmoded term and is actually
counterproductive. One perspective is oriented toward simply communicating more
concretely, and not forgetting what we are about as IT providers. The other
perspective is oriented toward a better way to design and build better
solutions, leveraging modern technology. Over time, some people have included
such a broad range of applications under the banner of "PDM" that the
term has lost whatever meaning it once had. In stating, "We don't need no
stinkin' PDM!", we mean the term and the mind set it tends to encourage,
as well as the confusion it creates for the users and IT providers. Existing
and emerging technologies are examined in the context of designing and
providing IT Infrastructure services in the terminology of the business.
- Johnson, "Infrastructure Services
Architecture, a Reference Architecture for the Rapid Response Manufacturing
Consortium Engineering Environments", RRM White Paper, June 1994.
(pdf)
An
Infrastructure Services Architecture is presented for the Enterprise
Integration Architecture framework of the Profile for Enterprise Integration.
The architecture enables flexible reengineering of the business process, and
the construction of product-centered virtual enterprises. The architecture
accommodates distributed, heterogeneous systems, providing protocols of service
invocation and cooperation. Emphasis is placed on following current trends
and directions in industry at large, and on taking advantage of current and
emerging commercial-off-the-shelf software.
This paper was based in great part on the "Profile
for Enterprise Integration", CALS Industry Working Group, June 1994
(below).
- Judson, Johnson, et al., "Profile for
Enterprise Integration", CALS Industry Working Group, June
1994.
Abstract Coming Soon
- Johnson, Judson, et al.,
"Infrastructure Services Architecture of the Profile for Enterprise
Integration", CALS Industry Working Group, June 1994.
An Infrastructure Services Architecture is
presented for the Enterprise Integration Architecture framework of the Profile
for Enterprise Integration. The architecture enables flexible re-engineering of
the business process, and the construction of product-centered virtual
enterprises. The architecture accommodates distributed, heterogeneous systems,
providing protocols of service invocation and cooperation. Emphasis is placed
on following current trends and directions in industry at large, and on
taking advantage of current and emerging commercial-off-the-shelf software.
- Johnson, "Concept of Operation and
Context of Reference Architecture in the Rapid Response Manufacturing
Consortium", RRM White Paper, April, 1993.
The Rapid Response Manufacturing Consortium was formed to
develop, implement, and validate technologies required to fundamentally change
the development and planning of mechanical components from a serial process to
an integrated set of concurrent processes. To facilitate this goal, a reference
architecture will be developed to guide the evolution of the systems of member
companies toward a common target architecture tailored for each company's
specific needs. This paper documents the context and concept of operation of
the reference architecture, per se, within the program, and introduces
candidate technical approaches.
- Johnson, "An Infrastructure Services
Architecture for Product Data Management", Texas Instruments White
Paper, June, 1993.
Abstract Unavailable
- Johnson, "Automated Distributed
Computer Management", Proceedings of the Digital Equipment
Computer Users Society, Fall 1986. (.pdf Format:
Paper,
Presentation)
A data flow model of a distributed,
loosely-coupled, time-sharing computer network is discussed. This model has
been used to initiate an evolving implementation of an automated software
distribution/installation and configuration management system capable of
handling software distribution and management for a system involving many
machines. The model defines a generic software product, independent of its
function, in the context of a computer services shop.
- Johnson, "VMS Rundown Interceptor, a
Robust Logout Driver Activator", Proceedings of the Digital
Equipment Computer Users Society, Spring 1986. (.pdf)
We have constructed a dataflow model of a generic software
product in the context of a general time-sharing environment. In this model,
points of control for each product are defined for each
"system-event". One of those events is the deletion of a process
(specifically, LOGOUT). The current implementation of the activation of our
logout driver (which calls per-product logout functions) is weak. An
unprivileged user can exit the system in a manner which will bypass the
activation of the logout driver. By the definition of our model, logout
operations are not optional. A weak activation of the logout driver is
intolerable. This study was undertaken to determine the feasibility of
implementing a robust logout driver activation.
- Johnson and Roberts, "A
Semi-Classical Approach to Collision Induced Dissociation",
Chemical Physics Letters, December, 1970. (.pdf)
Transition probabilities for the collision-induced
dissociation reaction, A + BC -> A + B + C, are calculated for a harmonic
oscillator model using a semiclassical approximation. One of the unique
features of this investigation is the appearance of quantum oscillations in the
dissociation probability as a function of collision energy. The number of
oscillations is correlated to the number of bound states of the BC molecule.