History of Computational Group Theory to 1993


In 1993 the Groups St Andrews conference was held in Galway, Ireland. Joachim Neubüser was an invited speaker and gave the talk An invitation to computational group theory. One section of his talk involved the history of Computational Group Theory (CGT) and we give a version of this below. We also give a version of his final section on 'Concerns'.

1. Some history of Computational Group Theory

Prehistory (before 1953).

Not only did group theory get started with the very computational problem of the solvability of quintics by radicals, but (hand-)computation of groups, from Mathieu's discovery of the first sporadics to Hölder's determination of the groups of order pqr, also made up a good deal of last century's group theory. However Dehn's formulation of the word, conjugacy, and isomorphism problems for finitely presented groups in 1911 [19] (even though it had precursors) may be thought of as the beginning of the prehistory proper of CGT, focussing attention on the request for group theoretical algorithms. Two points are worth noting:

- The challenge came from topology, not from inside group theory, and indeed even now people using groups outside group theory are strong "customers" of CGT.

- Novikov's proof of the algorithmic unsolvability of the word problem put an end to the hope that group theory could fulfil Dehn's request. Soon after came proofs of the non-existence of algorithms that could decide if a finitely presented group is trivial, finite, abelian, etc. (see G Baumslag's book [3] for a vivid description).

Nevertheless in 1936 J A Todd and H S M Coxeter [64] provided at least a systematic method of attempting to show finiteness by "coset enumeration", and today CGT provides a whole range of methods for the investigation of finitely presented groups that have proved quite powerful in frequent use, even though they are not decision algorithms.
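
To give a concrete impression of what coset enumeration delivers, here is a minimal sketch in the syntax of a modern version of the GAP system discussed later in this article (the presentation, a standard one for the alternating group A5, and the variable names are chosen purely for illustration):

    f := FreeGroup( "a", "b" );;
    g := f / [ f.1^2, f.2^3, (f.1*f.2)^5 ];;    # a finite presentation, here of A5
    Size( g );                                  # a coset enumeration proves the order is 60
    Index( g, Subgroup( g, [ g.1 ] ) );         # enumerating the cosets of <a> gives index 30

A successful enumeration proves finiteness and gives the index of the chosen subgroup; for an infinite group the enumeration simply never terminates, in line with the remark above that these methods are not decision algorithms.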

Of the period predating the use of real computers I want to mention three more events:

- In 1948 H Zassenhaus described an algorithm for the classification of space groups [65] that was put to very practical use much later.

- In 1951 M H A Newman, in a talk at the 'Manchester University Computer Inaugural Conference' [47], proposed to use probabilistic methods for getting some insight into the vast number of groups of order 256. The title of the talk, "The influence of automatic computers on mathematical methods", is remarkable for that time, and it should be noted that the (56 092) groups of that order were only determined in 1989 by E A O'Brien [54].

- Perhaps most notable because of its foresight, however, is a quotation from a proposal of A Turing in 1945 to build an electronic computer: "There will positively be no internal alterations to be made even if we wish suddenly to switch from calculating the energy levels of the neon atom to the enumeration of groups of order 720" (see [26], page 293).

Early history (1953-1967)

This may be thought of as starting with the first implementations of computing methods for groups. The first that I am aware of, from about 1953, are a partial implementation of the Todd-Coxeter method by B Haselgrove on the EDSAC II in Cambridge (see [35]) and an implementation of methods for the calculation of characters of symmetric groups by S Comet on the BARK computer in Stockholm [17]. Other areas of group theory were tried soon after. E T Parker and P J Nicolai made an (unsuccessful) search for analogues of the Mathieu groups [56]; in 1959 programs for calculating the subgroup lattice of permutation groups [44], and a little later for polycyclically presented 2-groups, were written. Other methods and special investigations followed. Programs of this time were written mostly in machine code, and the use of all kinds of trickery to save storage space and (although not quite as urgently) computing time was crucial. The end of this period, in which CGT started to unfold but had hardly contributed results to group theory that would greatly impress group theorists, is roughly marked by the Oxford conference "Computational Problems in Abstract Algebra" in 1967 [36]. Its proceedings contain a survey of what had been tried until then [45] but also some papers that lead into the ...

Decade of discoveries (1967-1977).

At the Oxford conference some of those computational methods were presented for the first time that are now, in some cases varied and improved, the workhorses of CGT systems: Sims' methods for handling big permutation groups [59], the Knuth-Bendix method for attempting to construct a rewrite system from a presentation [32], and variations of the Todd-Coxeter method for the determination of presentations of subgroups [42]. Others, like J D Dixon's method for the determination of character tables [20], the p-Nilpotent-Quotient method of I D Macdonald [41] and the Reidemeister-Schreier method of G Havas [23] for subgroup presentations, were published within a few years of that conference.

However at least equally important for making group theorists aware of CGT were a number of applications of computational methods. I mention three of them: the proof of the existence of Lyons' sporadic simple group by C C Sims in 1973, using his permutation group methods [60]; the determination of the Burnside group B(4, 4) of order 2^422 by M F Newman and G Havas using an extension of the original p-Nilpotent-Quotient method [49]; and the determination of the (4783 isomorphism classes of) space groups of 4-dimensional space [4], using not only Zassenhaus's algorithm but also the possibility of finding all subgroups of the maximal finite subgroups of GL(4, Z) with the programs for the determination of subgroup lattices.

This progress encouraged attempts to design CGT systems in which various methods could be used without having to translate data from one program to the other. By 1974 a first "Aachen - Sydney Group System" was operational [11], and in 1976 John Cannon published "A Draft Description of a Group Theory Language Cayley" [12], which may be thought of as a turn to ...

Modern Times.

The claim that since about 1977 the development of CGT has been speeding up rapidly can be justified by looking at four aspects: results, methods, systems and publicity.

Results.

Concrete computational results, of which again I list only some: Sims proved the existence of the Baby Monster, of order 4 154 781 481 226 426 191 177 580 544 000 000, as a permutation group of degree 13 571 955 000, using very special implementations based on his permutation group techniques [61]; the existence of Janko's J4 was proved using computational techniques for modular representations [51]. p-Nilpotent Quotient techniques are now strong enough to show e.g. that the class 18 factor of the restricted Burnside group B(2, 7) has order 7^6366 [25]. Following a proposal of M F Newman [50], E A O'Brien implemented a p-group generation program sufficient to determine the (58 760 isomorphism classes of) groups of order 2^n, n ≤ 8 [54]. Making and checking the Cambridge Atlas of Finite Groups [14], probably the most widely used group theoretical table, involved a great deal of implementation and use of group theoretical programs (in particular for working with characters), and the same holds for books listing Brauer trees of sporadics [24] or perfect groups [27], as well as for other listings, e.g. of primitive or transitive permutation groups.

Methods.

These and many more concrete computations were made possible by a large number of new methods and their integration into general and specialised systems.

I have mentioned already the p-group generation method, which builds on the p-Nilpotent Quotient Algorithm (pNQ) and links to the recent exploration of the possibility of classifying families of p-groups of constant coclass with the help of space groups [38]. Another very recent offspring of the pNQ is a (proper) Nilpotent Quotient Algorithm stepping down the lower central series of a finitely presented group. A number of proposals have been made, and a couple of them implemented recently, for finding soluble factor groups of finitely presented groups, using concepts such as cohomology groups, modular representations, or Gröbner bases [37], [55], [62].

Working via homomorphic images has become the method of choice for handling polycyclically presented finite soluble groups [39] but has also become indispensable for handling permutation groups [33].

The methods for the investigation of permutation groups have almost undergone a revolution, bringing in structure theory such as the O'Nan-Scott classification of primitive groups or even the classification of finite simple groups, now allowing, e.g., the determination of the composition series of groups of degree, in some cases, up into the hundreds of thousands. It is particularly interesting to note that some of these new methods for permutation groups, which have now become very practical too, were first brought in through rather theoretical discussions of the complexity of permutation group algorithms ([46], [31], [34], [8], [7], [9], to mention just a small selection of many papers on this subject).

A broad variety of methods, many interactive, are available for working with representations and characters [53], [40].

A long neglected, but now very rapidly growing, branch of CGT comprises methods for the study of matrix groups over finite fields [52], making strong use of the Aschbacher classification [1].

Systems.

A program system comprises various components: storage management; a problem-oriented language, for both interactive use and for writing programs, that can call system functions (and possibly directly access data); a library of functions that can be applied to the objects studied; and libraries of such objects. The first general system for CGT, developed in the mid-70s, was the Aachen-Sydney Group System, for which J Cannon provided the storage management (stackhandler) and the language (Cayley). Functions were written in Fortran. This system developed into the Cayley System [5], the functions of which were semi-automatically translated to C about 1987. Cayley has recently been remodelled under the name MAGMA [18] and extended to a general computer algebra system reaching beyond CGT.

Another general system for CGT, called GAP (Groups, Algorithms, and Programming), was started in Aachen in 1986. Its design is influenced by the computer algebra system Maple: GAP has a kernel, written in C, containing storage management, a language interpreter (in the future also a compiler) for the GAP language, and basic time-critical functions. The large majority of the GAP functions are written in the GAP language and so can be understood, checked and altered, a point on which I want to comment in the last section.

Both Cayley and GAP have found wide use. With Cayley more exact figures can be given, since it must be licensed for a certain fee; J Cannon reports over 200 licenses by 1990 [13]. GAP can be obtained free of charge via ftp and handed on, so that no full control of its spread exists. An indication is given by the roughly 240 members of the GAP-forum and the more than 350 reports of installations that we have received.

During the late 70s and the 80s a number of very specialised systems were written; some became absorbed into or were outdated by Cayley and GAP, while others complement the scope of the two general systems. Among the latter are Quotpic [28], which gives a very good visualisation of the steps in the calculation of factor groups of finitely presented groups; MOLGEN [22], for the application of group theoretical methods to the construction of graphs, in particular those representing the structure of organic compounds; and MOC [40], for the construction of modular character tables.

Publicity.

Already the demand for CGT systems indicates the widespread use of algorithmic methods in research (and increasingly also in the teaching) of group theory. The presence of CGT at general meetings on group theory, such as all four Groups St Andrews meetings in 1981, 1985, 1989, and now 1993, as well as the growing frequency of specialised meetings on CGT, are further indications. While the first meeting fully devoted to CGT was held in Durham in 1982 [2], further ones were Oberwolfach 1988 and 1992, and DIMACS 1991 [21] and 1994. In addition to a number of surveys (see in particular [15] for a recent one with many references), two monographs on parts of the field have appeared recently. While Greg Butler [10] tries to introduce computer science students with no prior knowledge of group theory to computational methods for permutation groups, Charles Sims [63] gives an authoritative account of methods for the investigation of finitely presented groups, emphasising common features of various approaches.

A review of Sims' Computation with finitely presented groups by Joachim Neubüser is at THIS LINK.

2. Some concerns.

Summing up what has been reported in the preceding sections, one may come to very optimistic conclusions. Computational Group Theory has provided evidence of its power; many of its methods are generally and comfortably available and widely used. And for the future: computers are getting faster, storage space bigger, both cheaper, methods better, program systems more comprehensive and easier to use. So the future is all gleaming with promise! Is it? I have some concerns.

  1. I mentioned 1967, the year of the Oxford Conference, as the time of breakthrough for CGT. That same year Huppert's book "Endliche Gruppen I" appeared. It is still on my shelf and as useful as 27 years ago, while none of the programs that we proudly talked about in Oxford is running any more. Of course Huppert built on more than a hundred years of research in Group Theory and almost a hundred years of experience of writing books about it, while Computational Group Theory and the art of implementing its methods were still in their infancy. However, I have to admit: while Huppert's book will most likely still be on your shelves in 2021, after the next 27 years, one must have serious doubts whether you will still be able to use GAP (or MAGMA) then. Computers and computer languages are still changing rapidly. It is, for instance, very much in vogue to bet on parallel computers as the tool of the future, but it is by no means clear to me which of the many models of parallel computers and operating systems and languages for them will make it. And we are still lacking safe methods to preserve the huge amount of work that has gone and is going into the development of systems such as GAP or MAGMA for even a foreseeable time span. We do not even have defined standards for saving the mathematical facts - character tables, group classifications etc. - that have already been created by use of CGT in a way that guarantees for many years the possibility of using and checking them.

  2. I had the privilege of getting involved with the proofreading of Huppert's book. Almost every page has been read, rewritten and reread several times over with immense care. Nevertheless the last edition of the book has almost three pages of errata. I have to admit for GAP that we did not have the manpower and time to do checking of any comparable intensity, and I doubt that the situation is much better with other systems. But worse: on page 128 Huppert's book refers to the Schreier conjecture as the "Schreibweise Vermutung". A typo like that in a textbook causes at most a smile from the reader; in a program it may cause some unpredictable action of the computer.

    To make one point clear: this is not reviving the old prejudice that one cannot trust results obtained by computer calculations. Modern computers are by orders of magnitude more reliable in doing computations by rules than human brains. When we talk about bugs we talk about human mistakes made in setting up these rules (i.e. programs), which are of exactly the same nature as mistakes that occur in proofs. In program systems such as GAP, with presently about 50 000 lines of C code in the kernel, 120 000 lines of GAP code in the program library, and a manual of about 1000 pages, bugs are practically unavoidable. Among these, those that cause the system to crash or produce obvious nonsense are annoying; the really dangerous ones are those that produce wrong output that still looks possible or even plausible at first sight. What can be done about this by system developers?

    Programs have always been tested by running examples; it is a problem of mathematical insight and imagination to choose a sufficiently representative set of such test examples, and a problem of manpower to make it big enough. We will come back to this in point 4.

    Use of modern programming languages that allow or even enforce more transparent implementation of algorithms has certainly helped very much to avoid and to find bugs - in the same way as modern standards of formalising and formulating proofs have done for the writing of mathematics. However it does not totally eliminate the problem - again the same has to be said of standards of writing proofs.

    Finally on this topic: providing corrections to a book is an easy matter: I use my 1967 copy of Huppert's book with a copy of those 3 pages of errata; if I want, I just need a pencil to mark them in the margin. What about removing bugs from a program? If source code is interpreted, it is still reasonably easy to apply a patch. If a (compiled) executable is used, then either, if available, the source must be corrected and then recompiled or, if not, a new executable must be obtained and installed. Either is a much more cumbersome procedure that needs much more assistance from the side of the system developer.

  3. You can read Sylow's Theorem and its proof in Huppert's book in the library without even buying the book, and then you can use Sylow's Theorem for the rest of your life free of charge; but - for understandable reasons of getting funds for the maintenance whose necessity I have pointed out in points 1 and 2 - for many computer algebra systems license fees have to be paid regularly for the total time of their use. In order to protect what you pay for, you do not get the source, but only an executable, i.e. a black box. You can press buttons and you get answers in the same way as you get the bright pictures from your television set, but in either case you cannot control how they were made.

    With this situation two of the most basic rules of conduct in mathematics are violated: in mathematics information is passed on free of charge and everything is laid open for checking. Not applying these rules to computer algebra systems that are made for mathematical research (as is practically exclusively the case with systems for CGT) means moving in a most undesirable direction. Most important: can we expect somebody to believe a result of a program that he is not allowed to see? Moreover: do we really want to charge colleagues in Moldova several years of their salary for a computer algebra system? And even: if O'Nan and Scott had to pay a license fee for using an implementation of their ideas about primitive groups, should they not in turn be entitled to charge a license fee for the use of their ideas in the implementation?

  4. The preceding discussion describes a dilemma. On the one hand, CGT systems are as useful, in fact even more indispensable, for giving algorithmic ideas an impact on the progress of our knowledge of concrete groups as books are for the dissemination of theoretical insight. On the other hand, it should have become clear that the development and maintenance of such systems pose more problems and involve more continuously needed manpower than writing a book. I hope also to have given good reason why I do not think that license fees are a desirable solution. Most systems have obtained some support through research grants - in the case of GAP we gratefully acknowledge such support from the Deutsche Forschungsgemeinschaft - but typically such support is given for a restricted time for original development rather than continuously for maintenance.

    I do not have a patent remedy for the problem but let me close by describing first what we try with GAP and then what in my view is needed on a broader scale if CGT is to continue to develop as well as it has since Oxford 1967.

  5. Our policy: both the (C) kernel and the GAP library of functions are distributed with full source, free of charge, through anonymous ftp. We provide patches at intervals of a few months and release new versions of the system about every 9-12 months. Some C programs written by other teams, devoted to special problems and tuned for these to high efficiency, are linked to GAP as 'share libraries' that can be called from GAP but remain under the responsibility of their authors.

    We maintain an electronic 'GAP-forum', not only for discussion of the use of GAP in research and teaching but also (and at least as important) for bug reports from users, which are thus made generally known immediately, and we encourage users to join the forum. In addition, helped by some experienced users, we try to provide advice on technical problems, e.g. installation problems, that are sent to an address 'GAP-trouble'.

    The kernel of GAP contains the time-critical parts of the system, such as storage management, the language interpreter and basic functions. It is clear that most users of GAP neither know nor want to know much about the methods used in the kernel; they rightly just want to rely on them. Therefore the kernel is kept as small as possible so that its development and maintenance can be managed by very few people who are highly experienced with system building (at present in the very first place Martin Schönert). On the other hand the fact that the much larger GAP library is written in the more transparent GAP language allows users to play an active part in the development of GAP. They can and should - as always when using a program - exercise all their expertise to check critically whether results "look right"; if they have doubts they can look at the code, trying to locate or even to correct mistakes, but in any case they can and should relay their doubts through the GAP-forum to the whole community of users and to the developers of the system. In fact this scheme has worked quite well during the last few years and has helped considerably in improving the reliability of GAP.

    It goes almost without saying that the relative ease of reading and writing the GAP language also gives users a much better chance to adapt an existing function or to write a new one for their particular problem, and in fact this is now frequently done (see the sketch below). It would be most desirable, however, if in many more cases some extra effort were spent on preparing such "private" programs for general use and making them available to the public. Again, with GAP we try to provide some organisational help for such publicising of programs. Of course the ability to control, adapt, or amend functions in GAP presupposes at least some basic knowledge about group theoretical algorithms, which should therefore start to enter group theory courses, not at all to replace but to complement some parts of the theory.
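
    As a purely illustrative sketch (the function and its name below are a made-up example, not a GAP library function), such a user-written function in a modern version of GAP might look like this:

        # a group is perfect if it equals its derived subgroup
        IsPerfectExample := function( G )
            return G = DerivedSubgroup( G );
        end;;

        IsPerfectExample( AlternatingGroup( 5 ) );   # true
        IsPerfectExample( SymmetricGroup( 4 ) );     # false

    Such a function can be read, tested and corrected at the GAP prompt in exactly the language in which most of the library itself is written.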

  6. I have emphasised the central role that we attribute to cooperation in our 'GAP policy'. There still remains a large amount of work to be done by the rather few who develop and maintain GAP. It is important to realise that such work, if it is to live up to the state of the art, must be performed in close contact with the progress of group theory and with group theoretical problems that are presently being studied. Progress in CGT, in scope but also in efficiency of implementations, has come far more from a better understanding of the underlying mathematics than from better implementation techniques. (The latter rather have their importance for the reliability, flexibility, and portability of the code.) That is, the development of a group theory system is a job for group theorists (some of whom are also good at system programming) rather than for professional system programmers. But then it must also be realised that the work of such people must be recognised as contributing to mathematics. I still often enough see papers whose authors have successfully used GAP (or MAGMA or some other system) and refer to this use by saying: "... and then by computer calculation we obtained ...". Would you quote a theorem from a paper that has an author and a title by saying: "... and then by browsing through our library we became aware of the following fact ..."? Of course, if we ask for this habit to change, then on the part of the system developers we have to be more careful in attributing contributions to individuals (with GAP we at least try), and we also have to find ways of certifying contributions to systems, similar to the established methods of publishing papers.

Summing up, in my view it is necessary to avoid everything that would separate work on the design and implementation of algorithms from other mathematical work. Rather we have to make every possible effort to adjust to habits and to adopt rules of conduct that are common practice in other parts of mathematics. For this Computational Group Theory needs the closest cooperation with all other parts of Group Theory. It is this that the title of the paper wants to express: I want to invite you to Computational Group Theory, which should not be considered as a group theoretical fool's paradise where shiny black boxes spit out character tables and cohomology groups; rather it should be considered as a field that needs a lot of tending, but is also worth your help in tending it.

References

  1. M Aschbacher, On the maximal subgroups of the finite classical groups, Invent. Math. 76 (1984), 469-514.
  2. M D Atkinson (ed.), Computational Group Theory, Durham, 1982 (Academic Press, 1984).
  3. G Baumslag, Topics in Combinatorial Group Theory, Lectures in Mathematics, ETH Zürich (Birkhäuser, 1993).
  4. H Brown, R Bülow, J Neubüser, H Wondratschek and H Zassenhaus, Crystallographic groups of four-dimensional space (Wiley-Interscience, New York, 1978).
  5. W Bosma and J Cannon, A handbook of Cayley functions (Computer Algebra Group, Sydney, Australia, 1991).
  6. L Babai, G Cooperman, L Finkelstein, E Luks and A Seress, Fast Monte Carlo algorithms for permutation groups, J. Comp. Syst. Sci. 50 (2) (1995), 296-308.
  7. L Babai, G Cooperman, L Finkelstein and A Seress, Nearly linear time algorithms for permutation groups with a small base, in Proc. International Symposium on Symbolic and Algebraic Computation, Bonn (ISSAC '91) (1991), 200-209.
  8. L Babai, E Luks and A Seress, Fast management of permutation groups, in Proc. 29th IEEE Symp. on Foundations of Computer Science (1988), 272-282.
  9. R Beals and A Seress, Computing composition factors of small base groups in almost linear time, in Proc. 24th ACM Symp. on the Theory of Computing (1992), 116-125.
  10. G Butler, Fundamental algorithms for permutation groups, Vol. 559 Lecture Notes in Computer Science (Springer, Berlin, 1991).
  11. J Cannon, A general purpose group theory program, in Newman [48], 204-217.
  12. J Cannon, A draft description of the group theory language Cayley, in Jenks [30], 66-84.
  13. J Cannon, A bibliography of Cayley citations, SIGSAM Bulletin 25 (1991), 75-81.
  14. J H Conway, R T Curtis, S P Norton, R A Parker and R A Wilson, ATLAS of finite groups (Oxford University Press, 1985).
  15. J Cannon and G Havas, Algorithms for groups, The Australian Computer Journal 27 (1992), 51-60.
  16. F Celler, J Neubüser and C R B Wright, Some remarks on the computation of complements and normalizers in soluble groups, Acta Applicandae Mathematicae 21 (1990), 57-76.
  17. S Comet, On the machine calculation of characters of the symmetric group, in M Riesz (ed.), Tolfte Skandinaviska Matematikerkongressen, Lund, 1954 (Håkan Ohlssons Boktryckeri, Lund, 1954), 18-23.
  18. J Cannon and C Playoust, An Introduction to MAGMA (School of Mathematics and Statistics, University of Sydney, 1993).
  19. M Dehn, Über unendliche diskontinuierliche Gruppen, Math. Ann. 71 (1911), 116-144.
  20. J D Dixon, High speed computation of group characters, Numer. Math. 10 (1967), 446-450.
  21. L Finkelstein and W M Kantor (eds.), Groups and computation, Proc. DIMACS Workshop, October 1991 (AMS-ACM, 1993).
  22. R Grund, A Kerber and R Laue, MOLGEN, ein Computeralgebra-System für die Konstruktion molekularer Graphen, Com. Math. Chem. 27 (1992), 87-131.
  23. G Havas, A Reidemeister-Schreier program, in Newman [48], 347-356.
  24. G Hiss and K Lux, Brauer trees of sporadic groups (Oxford University Press, 1989).
  25. G Havas, M F Newman and M R Vaughan-Lee, A nilpotent quotient algorithm for graded Lie rings, J. Symbolic Computation 9 (5-6) (1990), 653-664.
  26. A Hodges, Alan Turing: The Enigma of Intelligence (Unwin Paperbacks, 1985).
  27. D F Holt and W Plesken, Perfect groups (Oxford University Press, 1989).
  28. D F Holt and S Rees, A graphics system for displaying finite quotients of finitely presented groups, in Finkelstein and Kantor [21], 113-126.
  29. A Hulpke, Zur Berechnung von Charaktertafeln. Diplomarbeit (Lehrstuhl D für Mathematik, Rheinisch Westfälische Technische Hochschule, 1993).
  30. R D Jenks (ed.), SYMSAC 1976, Yorktown Heights, New York (Association Computing Machinery, New York, 1976).
  31. W Kantor, Sylow's theorem in polynomial time, J. Comput. Syst. Sci. 30 (1985), 359-394.
  32. D E Knuth and P B Bendix, Simple word problems in universal algebras, in Leech [36], 263-297.
  33. W Kantor and E M Luks, Computing in quotient groups, in Proceedings of the 22nd ACM Symposium on Theory of Computing, 1990 (1990), 524-534.
  34. W Kantor and D Taylor, Polynomial-time versions of Sylow's theorem, J. Algorithms 9 (1988), 1-17.
  35. J Leech, Coset enumeration on digital computers, Proc. Cambridge Philos. Soc. 59 (1963), 257-267.
  36. J Leech (ed.), Computational Problems in Abstract Algebra, Oxford, 1961 (Pergamon Press, Oxford, 1970).
  37. C R Leedham-Green, A soluble group algorithm, in Atkinson [2], 85-101.
  38. C R Leedham-Green and M F Newman, Space groups and groups of prime-power order I, Arch. Math. 35 (1980), 193-202.
  39. R Laue, J Neubüser and U Schoenwaelder, Algorithms for finite soluble groups and the SOGOS system, in Atkinson [2], 105-135.
  40. K Lux and H Pahlings, Computational aspects of representation theory of finite groups, in G O Michler and C M Ringel (eds.), Representation Theory of Finite Groups and Finite-Dimensional Algebras, Vol. 95 Progress in Mathematics (Birkhäuser, 1991), 37-64.
  41. I D Macdonald, A computer application to finite p-groups, J. Australian Math. Society 17 (1974), 102-112.
  42. N S Mendelsohn, Defining relations for subgroups of finite index of groups with a finite presentation, in Leech [36], 43-44.
  43. M Mecky and J Neubüser, Some remarks on the computation of conjugacy classes of soluble groups, Bulletin of the Australian Mathematical Society 40 (1989), 281-292.
  44. J Neubüser, Untersuchungen des Untergruppenverbandes endlicher Gruppen auf einer programmgesteuerten elektronischen Dualmaschine, Numer. Math. 2 (1960), 280-292.
  45. J Neubüser, Investigations of groups on computers, in Leech [36], 1-19.
  46. P M Neumann, Some algorithms for computing with finite permutation groups, in E F Robertson and C M Campbell (eds.), Proceedings of Groups - St. Andrews 1985, London Math. Soc. Lecture Note Ser. 121 (Cambridge University Press, 1986), 59-92.
  47. M H A Newman, The influence of automatic computers on mathematical methods, in F C Williams (ed.), Manchester University Computer Inaugural Conference, Manchester, 1951 (Tillotsons (Bolton) Ltd., Bolton, England).
  48. M F Newman (ed.), Second International Conference on the Theory of Groups, Canberra, 1973, Lecture Notes in Math. Vol. 372 (Springer, Berlin, 1974).
  49. M F Newman, Calculating presentations for certain kinds of quotient groups, in Jenks [30], 2-8.
  50. M F Newman, Determination of groups of prime-power order, in R A Bryce, J Cossey and M F Newman (eds.), Group theory, Proc. Miniconf., Austral. Nat. Univ., Canberra
  51. S Norton, The construction of J4, Proc. of Symp. in Pure Math. 37 (1980), 271-277.
  52. P M Neumann and C E Praeger, A recognition algorithm for special linear groups, Proc. Lond. Math. Soc. 65 (1992), 555-603.
  53. J Neubüser, H Pahlings and W Plesken, CAS: design and use of a system for the handling of characters of finite groups, in Atkinson [2], 195-247.
  54. E O'Brien, The groups of order 256, J. Algebra 143 (1991), 219-235.
  55. W Plesken, Towards a soluble quotient algorithm, J. Symbolic Computation 4 (1987), 123-127.
  56. E T Parker and P J Nicolai, A search for analogues of Mathieu groups, Math. Tables Aids Comput. 12 (1958), 38-43.
  57. M Schönert et al., GAP 3.3 (Lehrstuhl D für Mathematik, Rheinisch Westfälische Technische Hochschule, 1993).
  58. G J A Schneider, Dixon's character table algorithm revisited, J. Symbolic Computation 9 (5-6) (1990), 601-606.
  59. C C Sims, Computational methods in the study of permutation groups, in Leech [36], 169-183.
  60. C C Sims, The existence and uniqueness of Lyons' group, in T Gagen, M P Hale and E E Shult (eds.), Finite groups '72, Proceedings of the Gainesville conf., Univ. of Florida, Gainesville
  61. C C Sims, How to construct a baby monster, in M J Collins (ed.), Finite simple groups II, Proc. Symp., Durham, 1978 (Academic Press, 1980).
  62. C C Sims, Implementing the Baumslag-Cannonito-Miller polycyclic quotient algorithm, J. Symbolic Computation 9 (5-6) (1990), 707-723.
  63. C C Sims, Computation with finitely presented groups (Cambridge University Press, 1994).
  64. J A Todd and H S M Coxeter, A practical method for enumerating cosets of a finite abstract group, Proc. Edinburgh Math. Society (2) 5 (1936), 26-34.
  65. H Zassenhaus, Über einen Algorithmus zur Bestimmung der Raumgruppen, Comment. Math. Helv. 21 (1948), 117-141.

Written by Joachim Neubüser
Last Update February 2023