Computers are widely believed to be highly reliable at two tasks: computation and communication. They are also expected to be reasonably capable at intelligent tasks such as recognition and inference.
Technically, however, important challenges remain. In computation, computers are still quite weak at solving some important problems, such as protein folding. In communication, although we use large-scale social networks and apply machine learning techniques to the accumulated Big Data, we still lack an effective way to connect people to one another in a manner that could expand society and the economy. In recognition, despite the many deep learning systems in use, we are still far from a human-level recognition system.
Facing the above challenges, we believe we should change our current research approaches, as the dominant approaches of the first two decades of the 21st century have been more integrative than innovative. To this end, and to bring more innovation to the field, we pursue the following schemes in our research projects:
1. In computation and communication, we follow a coordinative approach instead of the traditional calculative approach. In the coordinative approach, the attributes and functionality of information and processors depend on their physical or logical location. For a good example of the coordinative approach, please consult this short paper from our Appetizer series.
2. In cognition and recognition, we try to establish models of objectivity, instead of relying on the current mathematical models of similarity (such as Bayesian networks, hidden Markov models, and fuzzy logic). In our proposed models, the behavior of an object depends solely on its structure. For a simple, primitive model of objectivity, please consult this conference paper from SPIE.
Although the above schemes are somewhat non-traditional, we expect them to generate notable advances in the field, on both theoretical and applied fronts. For readers interested in philosophy, we should note that our schemes are rooted in an ontological viewpoint called multiplicity in unity. Among the well-known philosophers who have explained this viewpoint in their works, we can cite two great 17th century philosophers: Molla-Sadra and Leibniz. We also suggest reading Leibniz's masterpiece, The Monadology, as a concise explanation of this ontology.
There are two major research programs at ATG. The Orange Program is devoted to basic research projects that aim at substantial improvements across the field. The other, the Appetizer Program, is devoted to applied research projects that contribute useful improvements to current trends in the IT industry. Below is a list of research projects in the Orange and Appetizer programs. It is not exhaustive, but it covers our major research projects.
A. Orange Program
This program is deeply rooted in our early research manifesto, when we began pursuing some alternative thinking in computer science in the 1990s. That alternative thinking led to some interesting achievements and later took shape as the schemes mentioned above, which are now the basis of the Orange Program. The following are some of the key projects in this program.
1. Passive Computation
The basic criterion for evaluating any computational model is the resources the model needs for computation. In conventional approaches, the main independent resources are time and space; other resources, such as energy, depend on time and space and are proportional to them. For computationally hard problems, e.g. NP-hard problems, the required resources (or at least one of them) grow exponentially while the scale of the problem grows only linearly.
In a recent approach, quantum computing, it seems that the required resources do not grow exponentially, because of the hybrid nature of the qubit, referred to as superposition. But it is still unclear how much energy is needed to isolate the required number of qubits from the environment in order to prevent decoherence. If this energy is not a polynomial function of the total number of qubits, then the approach offers no advantage.
In 1997, an optical computation model for an NP-complete problem, the traveling salesman problem (TSP), was presented in which the required time and space grow linearly with the size of the problem while the required energy grows exponentially. In the conventional computation model (the Turing machine), it is not possible for a machine to need an exponential amount of energy while using only polynomial amounts of time and space. We call our proposed computational model a passive machine because it uses passive optical elements in its structure.
In this project, we classify the existing computational models in terms of the independent resources they need for computation. The classification is as follows:
1. Turing (super-Turing) machine: time and space are independent; energy is proportional to time and space.
2. Quantum computer: time and space are independent; the relation between energy and time-space is still unclear (because of decoherence).
3. Passive machine: time, space, and energy are all independent.
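To make the resource argument concrete, here is a minimal brute-force TSP solver in Python (the distance matrix is an invented example, not from our work): on a conventional machine the number of tours it must examine grows as (n-1)!, i.e. faster than any polynomial in the number of cities, which is exactly the exponential resource growth discussed above.

```python
import itertools
import math

def tsp_bruteforce(dist):
    """Exhaustively check every tour starting and ending at city 0.
    Examines (n-1)! permutations, so time grows super-polynomially."""
    n = len(dist)
    best = math.inf
    for perm in itertools.permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        best = min(best, cost)
    return best

# 4-city example: only 3! = 6 permutations here, but 10 cities
# already require 9! = 362,880 and 20 cities about 1.2e17.
dist = [
    [0, 1, 4, 2],
    [1, 0, 3, 5],
    [4, 3, 0, 1],
    [2, 5, 1, 0],
]
print(tsp_bruteforce(dist))  # → 7 (the tour 0-1-2-3-0)
```

The passive machine trades this exponential time for exponential energy, which the Turing model cannot do.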
Remark: in May 2002, a method similar to the ATG optical computing model was proposed for solving the shortest-path problem (not the TSP). Although it is not currently effective for the TSP, it works well for the shortest-path problem and, more importantly, it is the second passive machine ever announced (after the ATG optical machine of 1997).
It is our pleasure to see that research on passive machines has expanded beyond ATG to other well-known academic institutions, such as Imperial College London and Harvard University. We hope to see serious work on passive machines in the industrial sector, too.
2. Theory of Need
The question of what a living thing is remains one of the oldest and most interesting topics in both philosophy and science. Many philosophers and scientists have addressed this question, but the dominant paradigm in computer science originated in von Neumann's work. His approach was a bottom-up view of the problem, which identifies basic characteristics of a living thing, such as reproduction and metabolism, as the factors distinguishing it from non-living things.
The question now is whether a top-down approach to the problem is possible. We think it is, and defining a living thing in this way leads to a definition like the following: a living thing is an entity that needs other entities in order to exist. We refer to this simple definition as the theory of need. Our preliminary research shows this is a very promising approach, with a range of applications from cognitive science and artificial intelligence to synergy and social science. We expect our work in this direction to yield notable improvements in several scientific areas beyond computer science.
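The top-down definition above can be sketched in a few lines of Python. This is only an illustrative toy (the entities and their "needs" edges are invented, not part of our research results): represent "needs" as a directed graph, and classify an entity as living exactly when it has at least one outgoing need.

```python
# Hypothetical "needs" graph: entity -> list of entities it needs to exist.
needs = {
    "bacterium": ["nutrients", "water"],
    "tree": ["water", "sunlight"],
    "rock": [],
    "water": [],
}

def is_living(entity):
    """Theory-of-need criterion: living iff the entity needs something."""
    return bool(needs.get(entity))

print([e for e in needs if is_living(e)])  # → ['bacterium', 'tree']
```

Note how the criterion refers only to the relation structure, not to any internal property such as metabolism, which is what makes the definition top-down.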
3. Behavioral Macroeconomics
Conventional macroeconomic thought is based on its underlying historical and social situations. That means different behaviors in a society result in different macroeconomic models and explanations. In other words, macroeconomic thought has never taken shape separately from the behavior of its underlying society.
The question now is: can we treat the behavior of the underlying society as part of our macroeconomic models, rather than as their underlying fact? In this new view, behavior is not the underlying fact of a society; rather, it can be modeled from the requirements and desires of the macro and micro entities in that society.
At first glance, the above approach looks like an exhaustive method of modeling a society's behavior, because we would have to consider the behaviors of a great many entities. Fortunately, our preliminary research shows that there are a few key requirements and desires in a society that can explain the behavior of that society as a whole, so we do not need to aggregate micro-behaviors to obtain it. We expect that identifying these key requirements and desires will help us build more general and more accurate macroeconomic models.
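The claim that a few key desires can stand in for the aggregation of micro-behaviors can be illustrated with a toy simulation. Everything here is invented for illustration (the "status desire" parameter, unit incomes, and noise range are hypothetical, not from our models): summing 10,000 noisy individual consumption decisions gives almost exactly the value predicted from the single shared key desire, because idiosyncratic variation averages out.

```python
import random

def micro_aggregate(n_agents, status_desire, seed=0):
    """Sum consumption over agents, each with an individual quirk
    around the shared 'status desire' level (unit income each)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_agents):
        personal = status_desire + rng.uniform(-0.05, 0.05)
        total += personal * 1.0
    return total

def key_desire_prediction(n_agents, status_desire):
    """Macro shortcut: no micro detail needed, just the key desire."""
    return n_agents * status_desire

micro = micro_aggregate(10_000, status_desire=0.6)
macro = key_desire_prediction(10_000, status_desire=0.6)
print(abs(micro - macro) / macro < 0.01)  # → True: within 1%
```

The design point is that the macro model is parameterized directly by the key desire, sidestepping the exhaustive enumeration of entities.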
B. Appetizer Program
The Appetizer Program is a more relaxed research program; we do not necessarily follow the schemes mentioned above here. The goal of the program is to contribute short- and medium-term improvements to the field, so we may use any simple but effective idea to develop research products that we think could be helpful. These products take the form of papers, research kits, and developer kits.
Because the research projects in this program are short term, we do not list projects here; instead, we list the resulting products in the corresponding sections. The research papers of the Appetizer Program are published here, the research kits here, and the developer kits here.