Welcome to Milagrosoft, the Sunflower Site
New updates: If you haven't been to this site lately, here is a list of new (to the site) or updated docs:
 MOISE Diagrams (Beer's VSM Model) [20120604] @ 17 pages. This is a simplification of Stafford Beer's Viable System Model, with some mnemonics thrown in to annotate nested rectangles.
 XMLSchema Quick Start [20110106] @ 25 pages. Enough XML Schema to get you started. Explanations of the various types are offered, and examples are developed and explained.
 Netbeans Notes [20110623] @ 6 pages. The comprehensive IDE Netbeans 7.0 is production ready. This note will get you started on setup with Java, as well as some configuration.
 XML Database Project [20101110] @ 10 pages. A micro tutorial on XML databases using the eXist native XML database. This can get you started in using it.
 Javadocs for 6.9 [20110513] @ 4 pages. Illustrates how to Javadoc both package-level code, using a package.html file, and how to Javadoc multiple packages, using an overview.html file.
 XQuery [20091210]. A starter set of notes on XQuery.
 AHP Basic Introduction [20090812] @ 11 pages.
An example of how to set up and analyze a simple AHP (Analytic Hierarchy Process) structure.
 Web Service Tutorial [20100601] @ 17 pg. A hands-on tutorial to build a web service and access it using a Java client. This was developed using the Netbeans 6.8 IDE. (This was tested in a management-oriented class!) Note: Netbeans 6.5 and 6.9 are slightly different, but not too much.
 Rob's collection of Java Notes [20110920] @ 8 pages. Various Java topics discussed with some code examples. Mostly a supplement to in-class discussions.
Hello all, Rob Rucker here. This site is primarily intended to help my students learn and apply some valuable quantitative tools and ideas for use in their university course work and organizations. Learning these tools will allow you to add value to the usual disorganized situation you are given to analyze. By add value, I mean that these tools will allow you to organize, present, and point out to your client important features of whatever it is you are given to work with. Since you are reading this, I expect that you are doing primarily knowledge work, which puts a premium on mentally penetrating to the heart/brain of the matter. For that, you need tools, intellectual tools. That's my real goal here: offering knowledge tools applicable to wrestling with batches of numbers, or batches of words (see Exploratory Data Analysis as well as Exploratory Discourse Analysis).
Whether student or guest though, I invite you to browse the contents of this site, and hope you find the documents interesting and helpful. If you have suggestions or comments, you can contact me at the email address at the end of this page.
In a minute I will give you links to all the site documents, but first, let me explain my specific intent here. I'm an Enterprise Informatics and Research Methods professor at several western universities, with both younger and working-adult students. During each year of teaching various courses, I meet with maybe 70 or so of our older students, and sad to say, most are pretty self-conscious and uncertain about how they can use the math, statistics, and quantitative skills they (maybe) studied once, long ago!
For my graduate students, the lack of quantitative skills is especially limiting, since one university used to (!) require a master's thesis for graduation. A thesis means research, and I have seen any number of my graduate students give up, or not even start, promising areas of research because they were uncertain about their quantitative skills.
Encountering this situation over and over, I started writing mini-tutorials, mainly for master's thesis students but also for undergraduates. Mostly, I have supplemented the assigned textbooks in specific areas not well understood by my students. In many cases my supplements draw together and simplify several areas not covered by a single text, but still directed at a necessary assignment. Beyond that, I feel drawn to introduce topics that are not in their textbooks at all, but should be!
My agenda in all of this is to help (push!) my students to include quantitative content in their various computer-oriented projects, business/work-oriented projects, and master's theses.
So, I plan to do my small part in introducing or reintroducing my students to available quantitative tools that will help support their sometimes wild, qualitative conjectures and tentative arguments! Up to now, my tutorial efforts have been on an in-class basis with handouts, explanatory lectures, and exercises (too much paperwork, too many versions to keep up with). Starting with this web site, I plan to make these introductory tutorials available to all of my students, as well as other interested people. All of the documents on this site ought to be considered drafts. Each class I teach results in an update and rewrite, so check the dates shown with each doc. Sorry I can't be with each of you to explain what I meant in these docs, but the bare words and graphics may be of some help anyway.
cheers
rob rucker
Site Content
I have listed the docs on this site in rough categories. Naturally they will overlap, so your best bet is simply to read the short description and then scan the doc to see if it interests you. I have noted their upload to this site by the dates, e.g., [20080319].
Many of these documents have been through numerous draft stages, but I am starting over with this site.
Thesis Notes
Here are a few papers on thinking about, and then writing, a master's thesis, the original focus of this site.
 Thesis Proposal and Chapter Construction Guide [20100415]. This is my current unofficial advice to my advisees, as well as to my general classes. I have included a list of some of the thesis research questions I have had some involvement with (I teach research methods and am also a thesis advisor). From these titles, you might gain some insight into what others have worked on. Keep in mind, though, that no matter what others have done in a particular field, if you work passionately, diligently, and intelligently, you can make an acceptable thesis out of absolutely any topic.
 A single diagram of a proposed quantitative infusion initiative [20081111]. This is a (micro) thought piece on the role of quantitative tools that could be introduced into a particular course sequence or, for that matter, into a whole curriculum.
Exploratory Data Analysis (EDA) and Statistics
I am a great admirer of the late John Tukey and his introduction of the graphic approach to gaining insight into opportunities and problems. He, in turn, is the inheritor of the equally brilliant graphics work of the 18th-century William Playfair. Both men have given us immensely important graphical data analysis tools that are the heart of modern graphic presentations. The word Exploratory is key here, since these tools are intended to suggest questions rather than to confirm them.
As an aside, I also want to draw your attention to an important contemporary worker in this field of graphical presentation for insight and wonder, Edward Tufte. I urge you to go right out and buy all of his books; they're great.
 EDA Introduction [200806]. This is a basic introduction to the starter set of Exploratory Data Analysis tools based on John Tukey's seminal 1977 text, "Exploratory Data Analysis". My tutorial covers just some of the initial approaches that Tukey pioneered. With a view to displaying raw batches of numbers as data distributions, histograms and stem-and-leaf plots are introduced and explained here.
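To give a taste of the idea, here is a minimal Python sketch of a stem-and-leaf display (my own illustration with made-up numbers; the tutorial itself works by hand):

```python
def stem_and_leaf(values):
    """Group integer values into stems (tens digit) and leaves (ones digit),
    Tukey's quick display for seeing the shape of a batch of numbers."""
    plot = {}
    for v in sorted(values):
        stem, leaf = divmod(v, 10)
        plot.setdefault(stem, []).append(leaf)
    return plot

# A made-up batch of exam scores:
batch = [62, 65, 68, 71, 71, 74, 80, 83, 97]
for stem, leaves in sorted(stem_and_leaf(batch).items()):
    print(stem, "|", " ".join(str(leaf) for leaf in leaves))
```

Reading the rows top to bottom gives you the distribution's shape at a glance, which is the whole point of the display.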
 Five Number Summaries and Box & Whisker Plots [20100621]. **Complete revamp of the earlier tutorial.** This is a follow-on to the EDA Introduction tutorial and shows two of Tukey's inventions, the 5-Number Summary and the Box and Whisker Plot. (As an aside illustrating the influence of Tukey, his Box Plots are recommended topics to be taught in the 8th grade, statewide, in Arizona.) As an example of the general usefulness of this technique, at the end of this tutorial I show how constructing a box & whisker plot at each time period results in a time series. Drawing a best line through the medians of the box plots results in a trend line. See Business Trend Analysis for an understandable example and explanation of all this.
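For a feel of the arithmetic, here is a small Python sketch of the 5-Number Summary using Tukey's hinges (my own illustration, not code from the tutorial):

```python
def five_number_summary(values):
    """Return (min, lower hinge, median, upper hinge, max).
    Tukey's hinges are the medians of each half of the sorted batch,
    where each half includes the overall median when n is odd."""
    xs = sorted(values)

    def median(v):
        n = len(v)
        mid = n // 2
        return v[mid] if n % 2 else (v[mid - 1] + v[mid]) / 2

    half = (len(xs) + 1) // 2
    return (xs[0], median(xs[:half]), median(xs), median(xs[-half:]), xs[-1])
```

The middle three numbers are exactly what the box of a box & whisker plot draws; the min and max give the whiskers.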
 Basic Statistics Bivariate Regression [20090811] @ 8 pages. An easy introduction to two-variable regression using the vector approach, with a geometric derivation of the regression parameters and correlation coefficient.
 Business Trend Analysis [20080923]. [Updated with correlation coefficient calculations.] A most basic introduction to finding trend lines through a set of data points. I start with the simplest possible example and build up the least squares analysis from multiple perspectives: graphic, geometric, algebraic formulas, and the calculus. The calculus is very lightly introduced, but I couldn't resist. (The geometric derivations could be of considerable interest, since they will give you valuable insight into the spatial structure of a trend analysis. Also check out Geometric Statistics Part I for a follow-on discussion of this.)
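The algebraic-formula perspective fits in a few lines of Python; this sketch (mine, not the tutorial's) returns the least-squares slope, intercept, and correlation coefficient:

```python
def trend_line(points):
    """Least-squares fit y = slope * x + intercept through (x, y) points,
    plus the correlation coefficient r."""
    n = len(points)
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)        # spread of x about its mean
    syy = sum((y - my) ** 2 for y in ys)        # spread of y about its mean
    sxy = sum((x - mx) * (y - my) for x, y in points)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5
```

Feeding in the medians of a sequence of box plots, as described above, gives you the trend line through them.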
 T-Test Reference Data [20090226]. A few students used Excel to do t-tests but were unsure of what the output meant! So, I used a few simple, easy numbers to do hypothesis tests for a zero population mean and then for the difference between two means. I compute confidence intervals for both situations and connect my analyses with other stat package outputs. I use geometry to calculate the t-statistics. Check this against Excel!
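The one-sample case is easy to replicate; here is a short Python sketch (my own illustration, not the numbers in the doc) whose output you can check against any stat package:

```python
from math import sqrt

def one_sample_t(data, mu0=0.0):
    """t statistic for H0: population mean equals mu0, using the
    sample standard deviation (n - 1 in the denominator)."""
    n = len(data)
    mean = sum(data) / n
    s = sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    return (mean - mu0) / (s / sqrt(n))
```

Feed the same numbers to Excel or any other package and the t statistic should agree.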
 Geometric Statistics Part I [20090403]. This is a geometry-based start on showing how to think about statistics, visually. I introduce vector spaces as the natural habitat of statistical experiments. Using geometry, I show how to set up experiments geometrically, and how the t-test and the F-test are simply ratios of lengths of vectors. I take a bit of time to explain how these tests come about and how to interpret their outcomes. I start with an initial example of the simplest possible experiment, using just pairs of values as my observations. I then analyze the same experiment taking 3 and then 5 observations. This gives me a chance to demonstrate the value of the geometric approach to statistics and how visualization provides insight into such topics as degrees of freedom and correlation coefficients! I have based my explanations on the excellent textbook by Saville and Wood, Geometric Statistics. The next tutorial, Geometric Statistics Part II, will be an extension of these geometric ideas to ANOVA (analysis of variance), contrasting and testing multiple population means. (See geoStatThreePopulationContrasts.)
 Two Population ANOVA Analysis [20090509]. A detailed example of the geometry of a two-population statistical experiment. I intend this to illustrate the geometry of testing for the difference of two population means.
 Testing various combinations of three population means [20090416]. Check out the use of contrasts to test null hypotheses. This is a very preliminary draft, just to see what is useful to students and what isn't.
 Multiple Regression Trend Analysis [20080416]. A follow-on to the Business Trend Analysis tutorial. I introduce the geometric approach to drawing vectors of variables in "subject" space so that I can see their angular relationships and so easily calculate their (multiple) correlation coefficients. Being able to draw variables as individual vectors provides complementary insight to drawing scatter plots. There are two important perspectives on representing data sets: variable spaces (scatter plot space) and subject spaces (pure variable vector space). I have concentrated on subject space drawings for reasons of insight. I hope you will take away some useful insights as well. (This tutorial is all written using the math package Mathematica.)
 Pareto Analysis [20080425]. Pareto charts [the 80/20 rule displayed] help you quickly determine the relative ranks of critical categories, based on a particular measure. This is a very simple, semi-graphical look at how the categories compare. For example, you might want to rank your departments based on sales figures. The categories here are the departments and the measure of interest is their sales. The Pareto chart will show you the cumulative contributions of each department to overall sales. Or, you might be interested in the cost per team member of various projects. Here, the categories are the projects and the measure of interest is the cost per team member. Or perhaps the categories are customers and the measure is the number of complaints registered during a certain time interval for that category of customer.
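The arithmetic behind a Pareto chart is just sort-and-accumulate; here is a tiny Python sketch with made-up department sales (my illustration, not from the tutorial):

```python
def pareto_table(measures):
    """Rank categories by their measure, largest first, and attach the
    cumulative percentage of the total -- the numbers a Pareto chart plots."""
    total = sum(measures.values())
    rows, running = [], 0
    for category, m in sorted(measures.items(), key=lambda kv: -kv[1]):
        running += m
        rows.append((category, m, round(100 * running / total, 1)))
    return rows

# Made-up sales figures by department:
sales = {"Toys": 10, "Garden": 25, "Electronics": 65}
```

Plotting the bars in this order, with the cumulative percentage line on top, is the Pareto chart itself.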
 Contingency Tables [20100415]. (Also called Cross Tabulations.) I show only a very simple example of determining whether two variables are independent or not. An example application could be to determine if there is gender bias in job positions. If gender and position are independent, the contingency table will show this; otherwise, bias is suggested. This mini-tutorial introduces the Chi-Square distribution, but very lightly! Definitely intended just to start the conversation about independent and non-independent variables.
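Here is a minimal Python sketch of the statistic itself (my illustration; the tutorial develops it by hand). Expected counts come from the independence model: row total times column total, over the grand total:

```python
def chi_square(table):
    """Chi-square statistic for a contingency table given as a list of rows.
    Large values suggest the row and column variables are not independent."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat
```

A perfectly balanced table gives exactly zero; the further the counts drift from the independence model, the larger the statistic.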
 Cluster Analysis [20090525] @ 8 pages. Introduction to cluster analysis. A very simple exploratory tool that will help you distinguish groups of entities based on their attribute values. This is the easiest of the multivariate statistical methods. Enjoy!
 Basic Finance Beta Calculation [20090831] @ 6 pages. A simple example of calculating and interpreting the financial metric called "beta". This is used in CAPM to assess the volatility of a stock compared to a reference portfolio such as the S&P index. The interpretation describes correlation and regression parameters using ideas found in the related tutorial, Basic Statistics Bivariate Regression.
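The calculation itself is just a covariance over a variance; a minimal Python sketch with made-up return series (my illustration, not the tutorial's data):

```python
def beta(stock_returns, market_returns):
    """CAPM beta: covariance(stock, market) / variance(market).
    beta > 1 means the stock swings more than the reference portfolio."""
    n = len(stock_returns)
    ms = sum(stock_returns) / n
    mm = sum(market_returns) / n
    cov = sum((s - ms) * (m - mm)
              for s, m in zip(stock_returns, market_returns)) / n
    var = sum((m - mm) ** 2 for m in market_returns) / n
    return cov / var
```

Notice this is exactly the regression slope of stock returns on market returns, which is the tie-in to the bivariate regression tutorial.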
Basic Math & Mathematica
This is a catch-all for math tools used throughout various quantitative analyses. I will also show some examples from the comprehensive Mathematica math package. Currently I am working on some tutorials exploring vector analysis, with a view to using vectors as the basis for a geometric approach to statistics. If you want to see how this improves statistical education, check out the text by David Saville and G. Wood, "Geometric Statistics"; it's a world-beater!
 Basic Physical Statistics [20080628]. This is a basic set of notes on the ideas behind calculating means, variances, standard deviations, and expected values. I use both the usual 'textbook' approach as well as physical interpretations from an engineering viewpoint. The physical interpretation shows that the mean is the 'center of gravity' and can also be interpreted as the 'center of mass'. (The gravity perspective comes about when you remember your friends sitting on a seesaw. The seesaw balances when the moments around the support axis balance.) The variance is then the 'moment of inertia about that center of mass/gravity'. At the end, I give a careful definition and discussion of Random Variables and relate the earlier basic calculations to the idea of various Expected Values of a Random Variable. (This is the first draft version of these notes, so it's pretty rough!)
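The physical reading translates directly into code; in this little Python sketch (mine, not from the notes) the mean is the balance point and the population variance is the moment of inertia, per unit mass, about it:

```python
def mean_and_variance(values):
    """Mean as the center of mass of equal point weights on a line;
    population variance as the moment of inertia about that center,
    per unit mass."""
    n = len(values)
    center = sum(values) / n                         # the seesaw balance point
    inertia = sum((v - center) ** 2 for v in values) / n
    return center, inertia
```

Shifting every value by the same amount moves the balance point but leaves the inertia untouched, which is the physical way to see why variance ignores location.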
 Basic Math & Basic Physics [20110306] @ 7 pgs. A graphical primer on the physics/math of motion.
 Basic Math of Growth Economics [20090927]. A graphical primer on exponents and natural logs using a biological example of algae growth. Present and future value concepts are covered in a biological context. Along the way, the constant 'e' is demonstrated to be the limit of a process involving continuous growth.
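That limiting process is easy to try for yourself; here is a one-function Python sketch of growing a unit principal at 100% per year, compounded n times (my illustration, not from the primer):

```python
def compound(n):
    """(1 + 1/n) ** n: one year of 100% growth split into n compounding
    steps.  As n grows, the result approaches e, about 2.71828."""
    return (1 + 1 / n) ** n
```

compound(1) is exactly 2.0; by n of a million or so, the value is within a few millionths of e.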
 A Mathematica Example of Conservation of Momentum and Mechanical Energy [20090427]. Just a few notes on how Mathematica could be used to explore a few physics ideas. (If you are interested, I will send you the Mathematica notebook file, which is interactive if you have Mathematica or Mathematica Player.)
 Trigonometry Basics [20080814]. Nothing ground-breaking here. I wrote this up to show my students how to prove Pythagoras' theorem and then got tangled up in explaining and demonstrating various follow-on formulas such as the law of cosines (the formula for the dot product falls out of this demonstration)! I also show the cross product (the directed area of a parallelogram) and its geometric basis. I use the standard geometry on the unit circle to introduce all six of the trig functions. If there is anything of value here, it is probably my use of pictures to show how the functions change as the angles change. As a very, very short excursion into complex variables, I do mention how to derive a few trig identities using Euler's formula. This interested me, since all trig identities can be derived from applications of this formula (plus a few supporting theorems and axioms)! (The 2-dimensional dot and cross product also come from the properties of complex multiplication.)
 A Mathematica function: Piecewise Water Rates [20080319]. This is a single function, written in the mathematics package called Mathematica. I wrote this up in response to a student's thesis questions and ideas about water usage and provider rates in Arizona. The water usage rates are in step-function form, and it was an opportunity to have some fun using my math/programming skills. This just uses practice data, but it does illustrate a couple of Mathematica ideas and could be extended in multiple ways.
 Vector Operations Quick Look [20090814] @ 6 pgs. This is a few pages excerpted from the longer tutorial Vector Spaces and Vector Operations. My intent is to give a basis for a discussion of vectors as used within the t-test and F-test, without going into a lot of detail. Using pictures and the dot product, I show how projection operations work within vector spaces.
 Vector Spaces and Vector Operations [20080926]. I use the results of the Trigonometry Basics tutorial to introduce vectors and their operations, such as vector arithmetic, the dot product, and the cross product. These ideas come up later in the geometric statistics tutorials, since you will need to know some basic characteristics of vector spaces and how to project vectors down onto vector subspaces.
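The projection operation is short enough to show here; a minimal Python sketch (my illustration) of projecting one vector onto the line spanned by another, using only the dot product:

```python
def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def project(u, onto):
    """Orthogonal projection of u onto the line spanned by `onto`:
    scale `onto` by (u . onto) / (onto . onto)."""
    scale = dot(u, onto) / dot(onto, onto)
    return [scale * b for b in onto]
```

The residual u minus project(u, onto) is perpendicular to `onto`, and that decomposition into a fitted part plus a perpendicular residual is exactly what the geometric statistics tutorials lean on.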
General Surveys and Their Statistics
This set of tutorials results from the fact that most of my thesis students wind up doing a survey of one sort or another, or maybe interviews. The Survey Question Analysis tutorial describes how to calculate some error measures based on sample sizes, as well as the inverse task of determining the sample size needed to stay within a specified error bound. See also Exploratory Discourse Analysis for one way to handle the argument structure found in open-ended survey questions. (Note: I suggest that survey design ought to be done prior to administering the survey!) Along these lines, I have noticed a number of students cooking up a survey without much consideration of how they are going to analyze and draw conclusions from it. (Not you, of course, but some other colleague!!)
For example, within the interview and survey process, people are often asked to rank their preferences on some topic. Along with this, people are often presented with alternatives to choose from based on those preferences. How to do this when there are multiple criteria underlying their choices as well as multiple levels? To gain a bit more insight into this process as well as to be able to deal with complex multilevel decision tasks, I wrote up some work by Thomas Saaty, called the Analytic Hierarchy Process (AHP).
 Survey Question Analysis [20080430]. Here is a rough-and-ready way to calculate error bands for various sample sizes for proportional sampling (yes/no questions) and Likert (preference) scales. I show how to find the sample size needed to achieve a given error, as well as the inverse question of "what is the error made, given a sample size?" (The usual caveats apply about the sample being random and representative.) Just by choosing conservative values, the calculations are considerably simplified.
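The conservative (p = 0.5) version of both calculations fits in a few lines; here is a Python sketch assuming the usual 95% z value of 1.96 (my illustration, not the tutorial's worksheet):

```python
from math import ceil, sqrt

def margin_of_error(n, z=1.96):
    """Conservative margin of error for a yes/no proportion at sample
    size n: z * sqrt(0.25 / n), i.e. z / (2 * sqrt(n))."""
    return z / (2 * sqrt(n))

def sample_size(error, z=1.96):
    """Smallest n whose conservative margin of error is at most `error`."""
    return ceil((z / (2 * error)) ** 2)
```

sample_size(0.05) gives 385, the familiar "roughly 400 respondents for a 5% error band" rule of thumb.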
 Analytic Hierarchy Process [20080430]. This is a mini-note on Thomas Saaty's important decision framework and associated math tools, called the Analytic Hierarchy Process (AHP). My thesis students usually confront the task of assessing people's preferences, by survey or interview, at some point during their research, and are then thrown back on a simple all-at-once ranking approach. Not good! People's preferences are often framed not just by a single criterion but by multiple criteria and, in real cases, multiple levels as well. How to do that in a way that's defensible? Saaty's work provides one approach, and works for both quantitative and qualitative criteria.
 AHP Basic Introduction [20090812] @ 11 pages.
An example of how to set up and analyze a simple AHP (Analytic Hierarchy Process) structure. Thomas Saaty's AHP process is extremely powerful, but routinely using it is not easy. In this tutorial I am trying out a technique that I hope can be adopted for survey and interview work that puts minimal demands on the respondent. This is definitely an experimental approach and I expect to have to modify it over time, so, stay tuned!
Refer to Analytic Hierarchy Process for a detailed discussion of the philosophy behind AHP as well as the math needed.
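To show the mechanics on the smallest possible case, here is a Python sketch of the standard hand-calculation approximation to Saaty's priority weights: normalize each column of the pairwise comparison matrix, then average across each row. (This is my illustration with a made-up two-criterion judgment; the tutorials develop the full method.)

```python
def ahp_priorities(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix: normalize each column, then average each row.
    This is the common hand approximation to the principal eigenvector."""
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    return [sum(pairwise[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Made-up judgment: criterion A is twice as important as criterion B.
weights = ahp_priorities([[1, 2], [0.5, 1]])
```

For a perfectly consistent matrix like this one, the approximation agrees exactly with the eigenvector; for inconsistent judgments it is only close, which is where Saaty's consistency checks come in.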
 Binomial Distribution Tutorial [200815]. This is an elementary but fairly comprehensive tutorial on the Binomial Distribution. I do some examples, some calculations, and some semi-graphical explanations of where this distribution comes from and how it is used. I take some time and give a plausible explanation (to me at least) of how to calculate the binomial coefficients.
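The distribution is compact enough to state as code; a minimal Python sketch of the probability mass function, with the binomial coefficient doing the counting (my illustration, not from the tutorial):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): the number of ways to place k
    successes among n trials, times the probability of any one such
    arrangement."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)
```

Summing the probabilities over k = 0..n gives exactly 1, which is a handy sanity check on any hand calculation of the coefficients.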
Organizational Structure and Management Cybernetics
This section shows a few ideas from my early work in organizational analysis as well as earlier work in language, especially analyzing the inherent argument structure within raw text.
 MOISE & DESMIA [20120604]. The MOISE & DESMIA tutorial is my attempt to present a tiny fraction of Stafford Beer's Managerial Cybernetics insights. This tutorial lays out the required structure of any organization that has hopes of surviving! What I have done is to take some of Beer's diagrams, rework them a bit, and give some memory aids for their use. Check out his books for a complete and intriguing vision of organizational modeling. (Much better than org charts!) The paper XSLT & Cybernetics, on this site, will give a bit more detail about Cybernetics.
 Service Oriented Architecture [20090601] @ 8 pages. A light-hearted note about a current software/business perspective called the Service Oriented Enterprise. The acronym SOA refers to Service Oriented Architecture, but it is a very flexible perspective. This approach is a natural evolution/extension from OO, with all of the same caveats! SOE, the service-oriented enterprise, is a distributed architecture built on service-oriented principles, with underlying "services" that allow organizational agility. (Loads of knowledge and discipline are needed to get to this state, however.)
 Exploratory Discourse Analysis [20080410]. This tutorial is to help thesis students analyze and quantitatively present any arguments that can be found in the raw text that they may encounter. That text can come from anywhere, but is often a consequence of responses to open-ended survey questions, interviews, verbal discussions, or any other body of text for that matter. I call this Exploratory Discourse Analysis, and you will see why as you read the tutorial. I think of this approach as one 'branch' of the Exploratory Data Analysis (EDA) paradigm, only this time applied to analyzing and diagramming English text instead of numbers.
As an aside, I have also included a section on "Truth Trees", also called Semantic Tableaux, that could be helpful for laying out more complex arguments.
Years ago, back in the 1980s, I got interested in logic and took a sequence of formal logic courses including propositional logic, predicate logic (FOPL), modal logic, deontic logic, and others too fierce to mention! What became clear, after a long time, is that these formal approaches, which involved translations into another abstract language, were ways to solve puzzles, interesting and intricate ones, but irrelevant to any practical purposes I had in mind.
Looking around for an alternative, I came across Charles S. Peirce, the brilliant 19th-century American philosopher and logician. Peirce realized, and took great care to explain, the limitations of formal logic. Introductory logic texts still aim to present the formal logic paradigm as a goal to strive for. Old ideas die a very slow death!
 Ray Jackendoff [Jackendoff 1983] provides a devastating critique of predicate logic that should have put it in the human reasoning trash can long ago, but alas, it didn't. Much current effort still goes into implementing the flawed FOPL paradigm (e.g., the Semantic Web).
My emphasis is now on what is called Natural Reasoning. I proposed a name for this approach, "Engineering Discourse Analysis", in an engineering context in the late '80s. Its most understandable modern proponent is Stephen Naylor Thomas [Thomas, 1997], and I am indebted to him for presenting the topic in a clear manner. He understands that human beings and their personal judgments must be part of the total logical analysis loop. That's my intent here as well: to introduce what at one level is a very simple approach to argument analysis, but actually runs much deeper. What is in conflict is the idea that reasoning can be divorced from the people who do it (the (post)positivist approach embodied in FOPL formal logic) and the natural reasoning approach that denies this.
 XSLT & Cybernetics [200302]. This is a very old unpublished paper by myself and a colleague, Eric Richardson. We were both learning the XSLT language (the XML transformation language) at the time, as well as some basics of Cybernetics. This was our exploration of how those ideas might be tied together in the context of document management.
Java, XML, XSLT, and the usual XML Bestiary
From time to time I get to teach C, Java, XML, and JavaScript as well as HTML/CSS and the rest of the web tools. To have reference material available in one place I wrote up a few tutorials.
 CSS [200708]. Cascading Style Sheets notes and a few specification tables.
 HTML [200708]. Hypertext Markup Language notes, plus a few specification tables and my general take on this language.
 XML Rings [201102]. Several years ago (2003), I was teaching XML at Oregon State and wanted to show my students a more integrated view of the web endeavor. I combined XML, HTML, CSS, XSLT, XPath, and the idea of an Ontology, and showed how to move between these, all in one 'ring' of technologies. Nowadays, this is all ho-hum, but in 2003/2004, putting all of these together in a course was fairly new. This doc has been lying around long enough, and even though it is a bit dated, it might be of some interest to someone! So, have a look.
 XMLSchema Quick Start [201011] @ 25 pgs. A quick entry to XMLSchema with examples, illustrations, and explanations. I have included some notes on deriving various types of elements.
 XML Database Project [201011] @ 10 pgs. A classroom tutorial on installing and using a native XML database, eXist. Illustrations and explanations included.
 XMLSchema [200606]. A convenient place to keep my notes about XML Schema.
 XPath & XSLT [2005]. A convenient place to keep my notes about XPath & XSLT.
 XQuery [20091210]. A starter set of notes on XQuery.
 Java Micro Tour [201108]. This is an introductory view of Java that I use to present a couple of ideas I have had about the language over the years.
 JDBC DB Building [20110307] @ 9 pages. Building a relational DB in Netbeans and accessing it.
 JavaDocs [20100713] @ 5 pages. A procedure showing how to insert graphics within your Javadoc output (the relevant files are shown). This was tested in Netbeans 6.8. (Superseded by Netbeans 6.9.1; the Ant build.xml file no longer needs to be updated.)
 Java Netbeans Installation tutorial plus Javadoc interface [20110623]. Set up your Java JDK 7 and Netbeans 7.0 environment with Javadoc access. This tutorial guides you through downloading and setting up a JDK 7 environment, then downloading and installing Netbeans 7.0, and finally downloading and moving the JDK 7 API into a directory where Javadoc can find it.
Manufacturing and the Service Industry: Thoughts and Tools
This section is about ideas and approaches associated with the manufacturing sector as well as the service industry. The notions of Operations Research, Lean Thinking, and Six Sigma are included in this section. Along the way I describe a few quantitative tools that can help with various analyses. The reader may note, though, that this whole site consists of various tools, and the ones here could just as well have been placed under any of the previous headings.
 North American Industrial Classification System [20030815]. Back in 2003, I got to teach an engineering economics class at Oregon State and introduced this inclusive North American classification system as a replacement for the old SIC (Standard Industry Classification) codes. As an incentive to study economics, I made the case that this would provide my students with a 'GPS' view of their chosen industry and its relative position within the North American Industrial Complex. It was interesting to me that the classification of an industry is based on what processes are used to produce its goods and services.
 W.E. Deming and the American Management Crisis [20080815]. This is a 'thought piece' that features W.E. Deming's ideas of what went wrong with American management in the '60s and '70s and what remains broken even today. The book I reference here is Out of the Crisis. Published in 1986, it's still dead-on.
 Linear Programming Simple Primer [20080819]. A very easy, very quick look at Linear Programming (LP), the preeminent math algorithm of the 20th century. I show how to set up a couple of simple resource allocation problems and demonstrate how to visually solve a two-dimensional problem. This is definitely just the barest introduction to this very valuable topic, but it is accessible to any reader (I think!).
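The visual solution works because an optimum of a bounded LP sits at a corner of the feasible region. This Python sketch (my illustration, with made-up constraints of the same resource-allocation flavor) brute-forces the corners by intersecting every pair of constraint boundaries and keeping the best feasible one:

```python
def solve_2d_lp(c, constraints):
    """Maximize c[0]*x + c[1]*y subject to a*x + b*y <= r for each
    (a, b, r) in `constraints`, plus x, y >= 0.  A bounded LP attains
    its optimum at a vertex, so intersect every pair of boundary lines
    (including the axes) and keep the best feasible corner."""
    lines = list(constraints) + [(1, 0, 0), (0, 1, 0)]   # axes as boundaries
    best = None
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            a1, b1, r1 = lines[i]
            a2, b2, r2 = lines[j]
            det = a1 * b2 - a2 * b1
            if abs(det) < 1e-12:
                continue                                  # parallel boundaries
            # Cramer's rule for the intersection point:
            x = (r1 * b2 - r2 * b1) / det
            y = (a1 * r2 - a2 * r1) / det
            feasible = (x >= -1e-9 and y >= -1e-9 and
                        all(a * x + b * y <= r + 1e-9
                            for a, b, r in constraints))
            if feasible:
                value = c[0] * x + c[1] * y
                if best is None or value > best[0]:
                    best = (value, x, y)
    return best   # (objective value, x, y)
```

For two variables this is exactly what solving by graph does; real problems with many variables use the simplex method instead.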
 Use Case Notes [20091118] @ 8 pages. Introduction to Use Cases within a Service Oriented Perspective.
 Quality Function Deployment (House of Quality) [200704]. Here is a short note on Quality Function Deployment. QFD is a general-purpose planning tool that seeks a semi-formal match between customer wants/needs and producer capabilities. Developed by Japanese engineers at Toyota, it became a crucial substrate of many manufacturing initiatives, both in Japan and later in the U.S. Other related approaches include Total Quality Management, Business Process (re)Engineering, and ultimately Six Sigma. It is also intimately integrated with project planning. I mostly think of QFD as a sophisticated, organized, semi-graphical planning matrix. Given its generic objective of matching customer wants to production capability, it could be mixed and matched with various other planning/analysis approaches; Strategic Choice (Planning Under Pressure) and Soft Systems Methodology come to mind. Operations Research (OR) would be a blanket description that includes all of these.
The Human Condition: Thoughts and Tools
This section is about ideas and approaches associated with various branches of psychology and human factors. When I get to it, I would like to note down Abraham Maslow's monumental contributions to psychology, and the management ideas based upon his principles. (He called this Eupsychian Management, a good name.) At some point I will compare his management ideas with those of W.E. Deming.
 Abraham Maslow's Needs Hierarchy [20080819]. A few notes and quotes from this foundational psychologist. This is more in the nature of a list of some of Maslow's quotes, so I can refer to one location during in-class discussions.
Reading Matter
These books and other sources are ones I have been impressed with, so you might like them too. Most of the tutorials have a reference component, so these aren't really necessary, but they are generally good reads.
 Beer, Stafford (1985) Diagnosing the Organization. A very short, very concise, very useful introduction to Beer's Managerial Cybernetics. He teaches Cybernetics via diagrams!
 Beer, Stafford (1981) The Heart of Enterprise. Managerial Cybernetics developed with no math, just diagrams and interesting text.
 Beer, Stafford (1979) The Brain of the Firm. Managerial Cybernetics developed from a neuroscience perspective, with diagrams, some math, and always interesting text.
 Jackendoff, Ray (1983) Conceptual Structures. Lays out how humans reason, and it's not remotely close to predicate logic.
 Maslow, Abraham. Anything by this man is worth poring over.
 Thomas, Stephen Naylor (1997) Practical Reasoning in Natural Language. A breakthrough text in raw-English analysis of arguments.
 Tufte, Edward (1983) The Visual Display of Quantitative Information, Graphics Press. A master of graphics design and exposition of graphics principles, with lots of great examples of what to do and what not to do.
Any comments or suggestions for improvement? Send me an email at robr@fastq.com