Research Article | Peer-Reviewed

An Efficient and Deterministic Object Recognition System

Received: 27 February 2026     Accepted: 11 March 2026     Published: 23 March 2026
Abstract

This paper presents an efficient deterministic framework for small-scale object recognition based on geometric normalization and invariant-driven elimination. The proposed method avoids training procedures and iterative optimization, operating instead through structured preprocessing, scalar invariant filtering, and deterministic vector comparison within a compact single-table vector database. Each object is represented as a feature vector derived from a fixed 7×7 grid and associated geometric descriptors. Recognition proceeds through a three-stage pipeline: (i) centroid-based translation and scale normalization, (ii) invariant-based candidate reduction using mass and second-order spatial moments, and (iii) localized orientation refinement followed by cosine similarity matching. Invariant-driven preselection significantly reduces the candidate set prior to fine comparison, improving computational efficiency while avoiding exhaustive rotational search. The method is lightweight, reproducible, and free of stochastic components. Experimental validation on a 100-object prototype library demonstrates stable and accurate identification under scale variation and controlled rotational perturbations. The results indicate that for structured, low-dimensional pattern libraries, deterministic geometric reasoning combined with finite database ordering provides a practical and computationally controlled recognition mechanism. The proposed framework is designed for structured and finite pattern libraries where objects can be represented through compact grid-based descriptors. In such environments, deterministic geometric preprocessing and invariant filtering significantly reduce the search space before final similarity evaluation. This allows recognition to be completed in a single deterministic evaluation cycle without iterative optimization or training procedures. 
The approach therefore provides a computationally lightweight alternative for embedded systems, symbolic recognition tasks, and controlled pattern libraries where structural compatibility can be exploited effectively.

Published in American Journal of Artificial Intelligence (Volume 10, Issue 1)
DOI 10.11648/j.ajai.20261001.23
Page(s) 148-157
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2026. Published by Science Publishing Group

Keywords

Geometric Normalization, Invariant Filtering, Cosine Similarity, Finite Compatibility Ordering, Spectral Gap, Rotation Handling, Moment Invariants, Single-table Vector Database

1. Introduction
Object recognition is commonly addressed through statistical learning frameworks, including convolutional neural networks and gradient-based optimization methods. Comprehensive surveys of statistical pattern recognition describe the dominance of learning-based approaches in modern systems. While such systems are powerful in complex and high-variability environments, they typically require training procedures, iterative parameter updates, and substantial computational resources.
Before the widespread adoption of deep learning, object recognition problems were frequently formulated using deterministic geometric descriptors and distance-based classification rules. Foundational work by Hu introduced moment invariants capable of providing robustness under translation, rotation, and scale transformations. Deterministic decision rules and minimum-distance classifiers were systematically developed within classical pattern classification frameworks. These approaches established recognition as a structured comparison problem over finite feature representations rather than as an optimization trajectory.
Geometric normalization techniques—including centroid alignment, covariance-based principal axis estimation, and region descriptors—are extensively documented in standard image processing literature. These methods provide efficient preprocessing mechanisms that avoid exhaustive rotational search. In particular, covariance-based orientation estimation enables deterministic alignment via second-order spatial moments. Template matching and normalized correlation measures further established cosine similarity as a computationally effective deterministic similarity metric.
In constrained environments—such as symbol recognition, embedded systems, or compact grid-based representations—a deterministic alternative can provide predictable behavior, reduced computational cost, and architectural transparency. Motivated by classical geometric reasoning, this work proposes a deterministic object recognition framework tailored for structured, low-dimensional pattern libraries.
Objects are stored in a single-table vector database in which each pattern is represented by a fixed-length row-wise flattened feature vector. This representation eliminates the need for relational joins, hierarchical feature trees, or learned convolutional filters. Identification proceeds through structured elimination rather than exhaustive comparison. The central idea is to reduce the candidate space using inexpensive geometric invariants prior to performing final vector similarity evaluation.
The proposed pipeline consists of three principal stages:
1) Geometric normalization, including centroid translation and scale adjustment;
2) Invariant-based elimination using total mass and second-order spatial energy to reduce the candidate set;
3) Localized orientation refinement followed by cosine similarity comparison for final identification.
Rotation is handled deterministically through covariance-based principal direction estimation, thereby avoiding full 360° brute-force sweeps. Rotation-invariant scalar descriptors collapse the search space prior to fine comparison, enabling efficient and reproducible matching.
The recognition problem is therefore reformulated as a controlled finite ordering task over a compact compatibility spectrum. Rather than approaching a solution asymptotically through iterative descent, identification is achieved by selecting the maximal element of a finite similarity set:
k̂ = arg maxₖ sₖ.
For structured libraries and low-dimensional representations, deterministic geometric reasoning provides an effective and computationally controlled recognition mechanism grounded in compatibility ordering rather than optimization dynamics.
2. Related Work
Modern object recognition systems are predominantly formulated within statistical learning frameworks. Comprehensive surveys describe the transition from classical rule-based classifiers to data-driven models trained through optimization procedures. In contemporary practice, convolutional neural networks (CNNs) and gradient-based optimization methods dominate large-scale visual recognition tasks due to their ability to model high intra-class variability and complex feature hierarchies.
These systems operate by minimizing empirical loss functions through iterative parameter updates. Stability and generalization are governed by optimization dynamics, regularization mechanisms, and training data distribution. While highly effective in unconstrained environments, such approaches typically require substantial training datasets, computational resources, and hyperparameter tuning.
Prior to the widespread adoption of deep learning, object recognition was frequently addressed through deterministic geometric descriptors and explicit distance-based decision rules. Moment invariants, as introduced by Hu, provided rotation, translation, and scale robustness without iterative learning. Extensions and generalizations of moment-based descriptors further established invariant feature extraction as a principled deterministic alternative.
Classical pattern classification theory formalized minimum-distance and linear decision rules within finite-dimensional feature spaces. In these formulations, recognition is achieved by evaluating compatibility measures rather than by optimizing high-dimensional parameter landscapes. Template matching and normalized correlation techniques later demonstrated that cosine similarity can serve as an efficient deterministic similarity metric for structured patterns.
Geometric preprocessing techniques—such as centroid alignment, covariance-based principal axis estimation, and region descriptors—are extensively documented in image processing literature. These methods enable alignment and normalization without exhaustive rotational sweeps, thereby reducing computational complexity prior to classification.
The present work does not aim to compete with large-scale learning systems designed for high-variability natural images. Instead, it revisits and systematizes deterministic geometric reasoning for structured, low-dimensional pattern libraries. The emphasis shifts from optimization trajectories to finite compatibility ordering. Recognition is treated as a well-posed decision problem over a compact similarity spectrum rather than as the convergence of an iterative learning process.
In this sense, the proposed framework is positioned within the classical deterministic tradition of pattern recognition, while incorporating structured invariant filtering to reduce computational effort before final similarity evaluation.
The objective of the present study is not to provide a benchmark comparison with statistical or learning-based recognition systems, but to establish the deterministic recognition framework and demonstrate its structural mechanism under controlled conditions.
3. Methodology
Unlike classical template matching, which performs exhaustive similarity comparison across all stored patterns, the proposed deterministic framework reduces the candidate set through invariant-based elimination prior to the final cosine similarity evaluation.
3.1. Library Construction and Database Structure
Each object is represented as an N × N real-valued matrix
A⁽ᵏ⁾ ∈ ℝᴺˣᴺ, k = 1, …, M (1)
For the prototype system, N = 7 and M = 100.
Each matrix is flattened using a consistent row-wise ordering into a vector
v⁽ᵏ⁾∈ℝᴺ²(2)
defined by
v⁽ᵏ⁾ = vec(A⁽ᵏ⁾)(3)
For N = 7,
v⁽ᵏ⁾∈ℝ⁴⁹(4)
Each object is assigned a unique identifier (UID).
The library is stored in a single-table vector database of the form:
UID | f₁ | f₂ | … | fᴺ²(5)
where fᵢ denotes the i-th flattened pixel value.
Optional scalar invariants (mass, energy) may be stored as additional columns.
3.2. Library Matrix Form
All object vectors are stacked column-wise to form the deterministic library matrix
L = [v⁽¹⁾ v⁽²⁾ … v⁽ᴹ⁾] ∈ ℝᴺ²ˣᴹ
For the 7×7 prototype,
L ∈ ℝ⁴⁹ˣ¹⁰⁰ (6)
Recognition is therefore reduced to structured matrix–vector operations.
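As a minimal sketch of the storage layout in Sections 3.1–3.2 (assuming NumPy; randomly generated placeholder patterns stand in for the paper's fixed designs):

```python
import numpy as np

N, M = 7, 100  # grid size and library size of the prototype system

# Placeholder library: M binary patterns on an N x N grid. Random content is
# used only to illustrate the layout; the paper uses fixed deterministic shapes.
rng = np.random.default_rng(0)
patterns = rng.integers(0, 2, size=(M, N, N)).astype(float)

# Row-wise flattening of each pattern into a length-N^2 vector (eq. 3).
vectors = patterns.reshape(M, N * N)

# Library matrix L: flattened vectors stacked column-wise, shape N^2 x M (eq. 6).
L = vectors.T
assert L.shape == (N * N, M)
```

Each column of L then corresponds to one row of the single-table database, keyed by its UID.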
3.3. Blind Object Representation
A query object is represented as
B∈ℝᴺˣᴺ(7)
Flattened into
b∈ℝᴺ²(8)
b = vec(B)(9)
3.4. Geometric Normalization
The blind object undergoes deterministic preprocessing.
Mass (zeroth moment):
M₀ = ∑ᵢ bᵢ(10)
Centroid:
xc = (∑ᵢ xᵢ bᵢ)/M₀,  yc = (∑ᵢ yᵢ bᵢ)/M₀ (11)
Translation to centroid:
X′ = X − xc(12)
Y′ = Y − yc(13)
Scale normalization:
x = λ X′(14)
y = λ Y′(15)
where λ ensures containment within the fixed N × N grid.
If rotation handling is required, orientation is estimated via the covariance matrix
C = [μ₂₀ μ₁₁; μ₁₁ μ₀₂] (16)
The principal direction is obtained from the eigenvectors of C.
Only a small angular vicinity is evaluated, avoiding exhaustive rotation.
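The normalization steps above can be sketched as follows (a NumPy illustration; `arctan2` is used in place of arctan so the orientation remains defined when μ₂₀ = μ₀₂):

```python
import numpy as np

def centroid_and_orientation(A):
    """Centroid (eq. 11) and principal-axis angle of a grid pattern A.

    The angle comes from the second-order central moments forming the
    covariance matrix C (eq. 16), as in eq. 32.
    """
    ys, xs = np.mgrid[0:A.shape[0], 0:A.shape[1]].astype(float)
    M0 = A.sum()                                        # zeroth moment (eq. 10)
    xc, yc = (xs * A).sum() / M0, (ys * A).sum() / M0   # centroid (eq. 11)
    x, y = xs - xc, ys - yc                             # translation (eqs. 12-13)
    mu20 = (x * x * A).sum()
    mu02 = (y * y * A).sum()
    mu11 = (x * y * A).sum()
    # Principal direction of the covariance matrix (eq. 32).
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (xc, yc), theta
```

For example, a 5×5 identity matrix (a diagonal stroke) yields a centroid at (2, 2) and an orientation of π/4.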
3.5. Deterministic Elimination Stage
Two scalar invariants are computed:
Mass:
M = ∑ᵢ bᵢ(17)
Radial energy:
E = ∑ᵢ (xᵢ² + yᵢ²) bᵢ(18)
Preselection is performed using tolerance windows:
| M − M⁽ᵏ⁾ | ≤ δ(19)
| E − E⁽ᵏ⁾ | ≤ ε(20)
This reduces the candidate subset prior to full similarity evaluation.
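A sketch of the elimination stage (the tolerance values `delta` and `eps` are illustrative defaults, not values from the paper; the radial energy is computed about each pattern's centroid, consistent with the centroid-translated coordinates of Section 3.4):

```python
import numpy as np

def preselect(b_img, library_imgs, delta=0.5, eps=2.0):
    """Invariant-based candidate reduction (eqs. 17-20).

    Returns the indices of library patterns whose mass and radial energy
    both fall within the tolerance windows of the blind object.
    """
    def mass(A):
        return A.sum()                                  # eq. 17

    def radial_energy(A):
        ys, xs = np.mgrid[0:A.shape[0], 0:A.shape[1]].astype(float)
        m = A.sum()                                     # assumed nonzero
        yc, xc = (ys * A).sum() / m, (xs * A).sum() / m
        return (((xs - xc) ** 2 + (ys - yc) ** 2) * A).sum()  # eq. 18

    Mb, Eb = mass(b_img), radial_energy(b_img)
    return [k for k, A in enumerate(library_imgs)
            if abs(Mb - mass(A)) <= delta               # eq. 19
            and abs(Eb - radial_energy(A)) <= eps]      # eq. 20
```

Only the surviving indices proceed to the cosine-similarity stage.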
3.6. Deterministic Matching
Each library vector is normalized once:
ṽ⁽ᵏ⁾ = v⁽ᵏ⁾ /∥v⁽ᵏ⁾∥(21)
The blind vector is normalized:
b̃ = b /∥b∥(22)
Cosine similarity is computed as:
sₖ = b̃ᵀ ṽ⁽ᵏ⁾(23)
In matrix form:
s = Lᵀ b̃(24)
where s ∈ ℝᴹ.
Recognition is performed by:
k̂= arg maxₖsₖ(25)
The UID corresponding to k̂ is returned.
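The matching stage reduces to two normalizations and one matrix–vector product; a minimal sketch:

```python
import numpy as np

def recognize(L, b):
    """Deterministic matching (eqs. 21-25).

    Normalizes the library columns (eq. 21) and the blind vector (eq. 22),
    computes s = L~^T b~ (eq. 24), and returns the arg max index (eq. 25)
    together with the full score vector.
    """
    Ln = L / np.linalg.norm(L, axis=0, keepdims=True)  # once per library
    bn = b / np.linalg.norm(b)
    s = Ln.T @ bn
    return int(np.argmax(s)), s
```

In practice the column normalization of L is performed once at library-construction time, so each query costs a single length-M matrix–vector product.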
3.7. Structural Decision Operator and Well-Posedness
The identification stage is governed by the deterministic operator (25).
For a finite library of size M, define the compatibility spectrum
S = {s₁, s₂, …, s_M}. (26)
Recognition reduces to selecting the maximal element of S; because S is finite, a maximal element always exists. Therefore, recognition is not formulated as a search trajectory in parameter space but as a finite ordering problem over a discrete compatibility spectrum.
Uniqueness
Let sⱼ denote the similarity score between the query object and the j-th library element.
Let k* denote the true class index and k̂ denote the estimated index obtained by the maximization rule
k̂ = arg maxₖ sₖ. (27)
If there exists k* such that
sₖ* > sⱼ for all j ≠ k*, (28)
then the solution is unique and
k̂ = k* (29)
Stability
Define the spectral gap
δ = sₖ* − maxⱼ≠ₖ* sⱼ. (30)
The quantity δ measures the separation between the correct class and its closest competitor. If δ > 0 and the compatibility scores depend continuously on the input representation, then perturbations smaller than δ⁄2 preserve the ordering of the scores. Consequently, the index k̂ remains invariant under sufficiently small input perturbations.
Existence follows from finiteness of S. Uniqueness follows from strict dominance. Stability follows from positive spectral separation.
Hadamard Well-Posedness
Existence follows from the finiteness of the score set S.
Uniqueness follows from strict dominance (28).
Stability follows from positive spectral separation δ > 0 and continuity of the score mapping.
Therefore, the deterministic decision problem is well-posed in the sense of Hadamard.
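The stability claim can be checked numerically by perturbing the scores directly (illustrative values, not drawn from the experiments): any perturbation of each score by less than half the gap leaves the arg max unchanged.

```python
import numpy as np

# Illustrative compatibility spectrum; index 1 is the dominant class.
s = np.array([0.51, 0.97, 0.62])
k_star = int(np.argmax(s))
gap = s[k_star] - np.max(np.delete(s, k_star))   # spectral gap (eq. 30)

rng = np.random.default_rng(1)
for _ in range(1000):
    noise = rng.uniform(-gap / 2, gap / 2, size=s.shape)
    # Ordering is preserved: the perturbed arg max still equals k_star.
    assert int(np.argmax(s + noise)) == k_star
```

This mirrors the argument in the text: a strictly positive gap converts stability into a simple separation condition, with no iterative correction involved.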
3.8. Deterministic Rotation Handling
In structured grid-based representations, rotational variability may degrade direct vector comparison. To avoid brute-force angular sweeps, orientation is estimated deterministically using second-order spatial moments.
Covariance matrix:
Σ = [σxx σxy; σxy σyy] (31)
Principal axis orientation:
θ = (½) arctan(2σxy / (σxx − σyy)) (32)
The blind object is rotated by −θ for coarse alignment. Optional refinement within a small angular vicinity may be applied if necessary.
For symmetric patterns where the principal direction is ambiguous, orientation filtering is bypassed and structural compatibility is resolved directly via cosine similarity.
3.9. Efficient Telescopic Elimination
Recognition may be further accelerated through staged elimination.
Let
D = {S₁, S₂, …, S_N} (33)
Each sample
Sᵢ ∈ ℝᵐ, where m = n × n. (34)
A partition operator
Pₖ: ℝᵐ → ℝ^(mₖ) (35)
with
m = Σₖ mₖ (36)
defines structured components.
Sequential Elimination Structure
Initial candidate set;
C₀ = D.(37)
Here, D denotes the full library (database).
At stage k
Cₖ = { Sᵢ∈Cₖ₋₁ | dₖ(Sᵢ, Q) < τₖ }.(38)
where
Q is the query object,
Sᵢ are library elements,
dₖ(·,·) denotes the stage-k compatibility metric,
τₖ is the threshold at stage k.
Monotonic Reduction Property
Each stage reduces the candidate set:
Cₖ⊆Cₖ₋₁⊆…⊆C₀.(39)
If τₖ is properly chosen, then
|Cₖ| ≤ |Cₖ₋₁|.(40)
Strict inequality holds whenever at least one candidate is eliminated at stage k.
Cardinality Decay
|C₀| = N(41)
|C₁|≪N(42)
|C₂|≪|C₁|(43)
|CK| ≈ 1(44)
This produces geometric shrinkage rather than brute-force comparison.
Rotation, if required, is applied only after substantial reduction of the candidate set.
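The staged elimination of (37)–(40) can be sketched as a generic filtering loop; the stage metrics and thresholds below are placeholders supplied by the caller, not fixed choices from the paper:

```python
def telescopic_filter(candidates, query, stages):
    """Sequential elimination (eqs. 37-40).

    `stages` is a list of (metric, threshold) pairs, where each metric is a
    function d(sample, query) -> float. Each pass keeps only candidates with
    d(S_i, Q) < tau_k (eq. 38), so the candidate sets shrink monotonically
    (eq. 39): C_k is always a subset of C_{k-1}.
    """
    C = list(candidates)                          # C0 = D (eq. 37)
    for d, tau in stages:                         # stage k
        C = [s for s in C if d(s, query) < tau]   # eq. 38
    return C                                      # C_K, ideally near one element
```

For instance, filtering the scalar library [1, 2, 5, 9] against the query 2 with absolute-difference metrics and thresholds 3.0 then 1.0 shrinks the candidate set from four elements to one.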
4. Numerical Example (3×3 Prototype)
To illustrate the deterministic recognition procedure in its most elementary algebraic form, consider a minimal 3×3 binary prototype library. This example serves as a complete structural sanity check of the formulation introduced in Section 3.
4.1. Library Construction
Let the three stored objects be denoted by X, T and +.
Their matrix representations are:
Figure 1. The matrices of the library.
Each object is flattened row-wise into a 9-dimensional column vector.
v⁽¹⁾ = [1 0 0 1 1 0 0 0 0]ᵀ(45)
v⁽²⁾ = [0 1 0 1 1 1 0 1 0]ᵀ(46)
v⁽³⁾ = [0 0 1 0 1 1 0 0 1]ᵀ(47)
The library matrix is constructed by stacking the vectors column-wise:
L = [v⁽¹⁾ v⁽²⁾ v⁽³⁾]∈ℝ⁹ˣ³(48)
Thus, each column of L corresponds to one stored object in the database representation.
4.2. Blind Object
Figure 2. The library matrix and database table without UID.
The blind object B coincides with the second stored pattern. After row-wise flattening:
b = [0 1 0 1 1 1 0 1 0]ᵀ.(49)
4.3. Elimination Using Mass
Define the mass (ℓ₁ content) of a binary object as
M = ∑ᵢ bᵢ(50)
where bᵢ ∈ {0,1} denotes the binary pixel value at index i.
For the blind object:
M = 5.(51)
The mass represents the total number of active pixels and constitutes a rotation- and translation-invariant scalar descriptor.
The stored objects have masses:
M⁽¹⁾ = 3 (52)
M⁽²⁾ = 5 (53)
M⁽³⁾ = 3 (54)
Using a strict mass window δ = 0, only object 2 satisfies the equality constraint. Therefore, deterministic elimination alone identifies the correct candidate before any similarity computation.
If elimination is intentionally bypassed, we proceed to cosine similarity evaluation.
4.4. Deterministic Cosine Matching
The Euclidean norms are:
‖v⁽¹⁾‖ = √3 (55)
‖v⁽²⁾‖ = √5 (56)
‖v⁽³⁾‖ = √3 (57)
‖b‖ = √5 (58)
Normalized vectors:
ṽ⁽¹⁾ = v⁽¹⁾/√3, (59)
ṽ⁽²⁾ = v⁽²⁾/√5, (60)
ṽ⁽³⁾ = v⁽³⁾/√3, (61)
b̃ = b/√5. (62)
Since b = v⁽²⁾, cosine similarities are computed as
sₖ= b̃ᵀṽ⁽ᵏ⁾.(63)
Dot Products
v⁽²⁾• v⁽¹⁾= 2(64)
v⁽²⁾• v⁽²⁾= 5(65)
v⁽²⁾• v⁽³⁾= 2(66)
Thus,
s₁ = 2/(√3·√5) ≈ 0.516 (67)
s₂ = 1 (68)
s₃ = 2/(√3·√5) ≈ 0.516 (69)
4.5. Recognition Result
The similarity spectrum is
s₁ ≈ 0.516(70)
s₂ = 1(71)
s₃ ≈ 0.516(72)
Therefore,
k̂ = arg maxₖ sₖ = 2(73)
The system deterministically identifies object 2 with maximum cosine similarity equal to 1.
Structural Interpretation
This prototype verifies:
1) Correct row-wise vectorization
2) Proper database matrix construction
3) Exact ℓ₂ normalization
4) Accurate cosine similarity computation
5) Deterministic arg max selection
No gradient descent, no loss minimization, and no probabilistic modeling are involved. The recognition emerges directly from algebraic structure.
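The 3×3 walk-through can be reproduced in a few lines (using the flattened vectors exactly as listed in (45)–(47); indices are 0-based in code, so object 2 is column index 1):

```python
import numpy as np

# The three stored 3x3 objects, flattened row-wise as in (45)-(47).
v1 = np.array([1, 0, 0, 1, 1, 0, 0, 0, 0], dtype=float)
v2 = np.array([0, 1, 0, 1, 1, 1, 0, 1, 0], dtype=float)
v3 = np.array([0, 0, 1, 0, 1, 1, 0, 0, 1], dtype=float)
L = np.column_stack([v1, v2, v3])          # library matrix (eq. 48)
b = v2.copy()                              # blind object (eq. 49)

# Mass-based elimination (eqs. 50-53): with a strict window, only the
# object whose mass equals that of b survives.
masses = L.sum(axis=0)
survivors = np.where(masses == b.sum())[0]

# Cosine matching (eqs. 55-73), elimination intentionally bypassed.
Ln = L / np.linalg.norm(L, axis=0)
s = Ln.T @ (b / np.linalg.norm(b))
k_hat = int(np.argmax(s))                  # arg max selection (eq. 73)
```

Running this confirms s₂ = 1 and s₁ = 2/√15 ≈ 0.516, with the arg max selecting object 2.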
5. 7×7 Structured Pattern Library and Experimental Evaluation
5.1. Library Definition
A structured library of 20 deterministic 7×7 binary patterns was constructed.
Each object A⁽ᵏ⁾ ∈ ℝ⁷ˣ⁷ was flattened row-wise into
v⁽ᵏ⁾∈ℝ⁴⁹.(74)
The library contains both symmetric and asymmetric geometric structures in order to create controlled structural diversity without randomness.
The patterns are defined as follows:
Figure 3. Structured 7×7 deterministic pattern library used for the experimental evaluation.
4. Filled Square — All entries equal to 1.
5. Vertical Bar — Column 4 = 1, all other entries 0.
6. Horizontal Bar — Row 4 = 1.
8. Small Center Dot — Only (4,4) = 1.
9. Two Vertical Bars — Columns 2 and 6 = 1.
10. Two Horizontal Bars — Rows 2 and 6 = 1.
11. T Shape — Row 1 full + column 4 downward.
12. Inverted T — Row 7 full + column 4 upward.
13. I Shape — Column 1 + row 7.
14. Reverse L — Column 7 + row 7.
15. U Shape — Columns 1 and 7 + row 7.
16. Inverted U — Columns 1 and 7 + row 1.
17. Thick Diagonal — Three-pixel thick main diagonal.
18. Frame with Cross Inside — Hollow border + center cross.
19. 3×3 Filled Block Centered — Middle 3×3 region = 1.
20. Four Corners Only — (1,1), (1,7), (7,1), (7,7) = 1.
The library matrix is defined as
L = [v⁽¹⁾ v⁽²⁾ … v⁽²⁰⁾] ∈ ℝ⁴⁹ˣ²⁰ (75)
Each object was stored in a single-table database structure with a unique identifier (UID).
No secondary tables or relational joins were used.
The design deliberately includes:
1) Purely symmetric structures
2) Orientation-sensitive asymmetric shapes
3) Sparse and dense patterns
4) Corner-only and frame-based geometries
ensuring controlled spectral and structural variation.
For clarity, the structured 7×7 pattern library used in the experiments is illustrated in Figure 3, providing a visual reference for the deterministic recognition process described in this section.
5.2. Stage I — Baseline Deterministic Validation
Each of the 20 stored objects was used once as blind input without rotation or perturbation.
Procedure:
1) Row-wise flattening
2) Unit Euclidean normalization
3) Cosine similarity computation
sₖ = b̃ᵀ ṽ⁽ᵏ⁾(76)
Decision rule:
k̂ = arg maxₖ sₖ.(77)
Result:
Correct identification was obtained for all tested instances.
This confirms:
1) Correct flattening
2) Correct database storage
3) Proper ℓ₂ normalization
4) Deterministic cosine evaluation
No stochastic element or optimization process was involved.
Recognition emerges directly from algebraic structure.
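A reduced version of this baseline check can be sketched with a handful of the patterns described in Section 5.1 (constructed here from their textual definitions; this is a four-pattern subset, not the full 20-pattern library of Figure 3):

```python
import numpy as np

N = 7
# Patterns built from their Section 5.1 descriptions (0-based indexing).
vertical_bar = np.zeros((N, N)); vertical_bar[:, 3] = 1      # column 4 = 1
horizontal_bar = np.zeros((N, N)); horizontal_bar[3, :] = 1  # row 4 = 1
center_dot = np.zeros((N, N)); center_dot[3, 3] = 1          # only (4,4) = 1
four_corners = np.zeros((N, N))
four_corners[[0, 0, 6, 6], [0, 6, 0, 6]] = 1                 # four corners = 1

lib = [vertical_bar, horizontal_bar, center_dot, four_corners]
L = np.column_stack([A.reshape(-1) for A in lib])            # row-wise flatten
Ln = L / np.linalg.norm(L, axis=0)                           # unit normalization

# Baseline validation: each stored object used once as blind input (eqs. 76-77).
for i, A in enumerate(lib):
    b = A.reshape(-1)
    s = Ln.T @ (b / np.linalg.norm(b))
    assert int(np.argmax(s)) == i and abs(s[i] - 1.0) < 1e-12
```

Each self-match scores exactly 1 while every cross-match scores strictly less (e.g. vertical versus horizontal bar share only the center pixel, giving 1/7), so the arg max identifies every pattern correctly, mirroring the Stage I result.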
5.3. Stage II — Rotation Robustness Evaluation
To evaluate geometric robustness, rotated versions of each object were generated at ±10°.
The rotated images were re-sampled onto the 7×7 grid and provided as blind inputs.
Two deterministic strategies (Section 3.8) were evaluated.
Strategy A — Alignment-First
For each rotated input:
1) Centroid computation
2) Covariance matrix estimation
3) Principal-axis orientation computation
θ = (½) arctan(2σxy / (σxx − σyy))
4) Coarse rotation by −θ
5) Cosine similarity evaluation
No full 360° sweep was performed.
Observation:
For asymmetric patterns, orientation was correctly estimated and cosine similarity peaked at the correct index.
For symmetric patterns, orientation ambiguity did not affect recognition because cosine similarity remained discriminative under symmetry.
Strategy B — Elimination-First
For each rotated input:
1) Mass window filtering
2) Radial energy filtering
3) Optional partition-based elimination
4) Candidate reduction (20 → approximately 3–5 objects)
5) Orientation estimation applied only to the reduced subset
6) Cosine similarity evaluation
Observation:
Significant candidate reduction was achieved before any rotational alignment was performed.
Recognition remained correct for all tested cases.
5.4. Efficiency Analysis
Without elimination, similarity is computed as
s = Lᵀ b(78)
requiring comparison with all 20 objects.
With invariant filtering, the candidate set shrinks geometrically:
|C₀| = 20(79)
|C₁|≪20(80)
|C₂| ≈ 3–5(81)
Rotation, when applied, is restricted to this reduced subset.
This constitutes structured shrinkage rather than brute-force search.
Structural Interpretation of the 7×7 Experiment
The experiment demonstrates:
1) Deterministic recognition without training
2) Rotation handling without angular sweep
3) Database shrinkage via structural invariants
4) Stability governed by spectral separation
Recognition is completed in a single evaluation cycle.
The framework operates through algebraic ordering and invariant filtering rather than iterative convergence.
The objective of this study is to establish and demonstrate the deterministic recognition framework rather than to benchmark large statistical datasets. For this reason, controlled structured pattern libraries (3×3 and 7×7) were intentionally used as prototype systems to verify the invariant filtering mechanism and deterministic cosine matching process in a transparent and reproducible manner.
Since invariant filtering reduces the candidate set prior to similarity evaluation, the effective computational cost grows significantly slower than the library size, which supports scalability for larger structured pattern libraries.
6. Conclusion
This work introduced a deterministic object recognition framework based on structural compatibility and finite ordering rather than iterative optimization.
The identification stage is governed by the structural decision operator
k̂ = arg maxₖ sₖ,
which selects the maximal element of a finite compatibility spectrum.
The decision problem is well-posed when:
1) The compatibility spectrum is finite (existence)
2) A strict dominance condition holds (uniqueness)
3) A positive spectral gap exists (stability)
Stability is therefore encoded in structural separation rather than in convergence dynamics. The spectral compatibility gap plays a role analogous to curvature in optimization-based systems, but without requiring descent trajectories or iterative refinement.
Experimental validation on structured 3×3 and 7×7 libraries confirms:
1) Correct matrix-to-vector representation
2) Deterministic cosine-based matching
3) Rotation handling without angular sweep
4) Candidate reduction via invariant filtering
5) Fully reproducible identification
The framework replaces iterative energy expenditure with structural preconditioning and finite comparison, completing recognition in a single evaluation cycle.
In contrast to training-based approaches, the solution is not approached asymptotically. It is identified directly through algebraic ordering.
The presented methodology demonstrates that small-scale structured recognition can be formulated as a well-posed deterministic decision problem governed by compatibility geometry rather than optimization dynamics.
This work does not seek to replace optimization-based learning systems. Instead, it demonstrates that for structured small-scale recognition problems, deterministic compatibility and finite ordering may suffice without iterative training.
Beyond its experimental validation, the framework clarifies a conceptual point: recognition need not be formulated as trajectory minimization in parameter space. When the object set is finite and structurally separated, identification reduces to a compatibility ordering problem over a discrete spectrum. The operator
k̂ = arg maxₖ sₖ
acts as a finite decision selector rather than as the endpoint of an optimization path. Stability is ensured by the spectral compatibility gap Δ = s₍₁₎ − s₍₂₎, where s₍₁₎ and s₍₂₎ denote the largest and second-largest compatibility values. A strictly positive Δ guarantees robustness under bounded perturbations without requiring iterative correction. In this sense, determination replaces convergence, and ordering replaces descent. The results demonstrate that, for structured small-scale libraries, recognition can be achieved through algebraic separation and invariant filtering rather than through iterative energy minimization.
For structured, low-dimensional libraries, deterministic geometric reasoning combined with invariant filtering provides a computationally controlled alternative to optimization-based methods.
The presented framework also clarifies a conceptual distinction between deterministic recognition and optimization-based learning systems. In the proposed formulation, identification is not obtained through a trajectory of parameter updates but through a finite compatibility ordering over a discrete object library. Once the representation is constructed and the invariants are computed, recognition reduces to selecting the maximal compatibility element in a finite set. This structural interpretation emphasizes that, in controlled environments with finite pattern libraries, object recognition can be formulated as a well-posed deterministic decision problem rather than as an iterative optimization process.
7. Limitations and Future Work
Despite its structural clarity and deterministic guarantees, the present framework operates under controlled assumptions that bound its applicability.
First, the experiments were conducted on structured binary patterns with moderate dimensionality (3×3 and 7×7). While the algebraic formulation extends naturally to higher dimensions, computational scaling and spectral crowding effects require further investigation.
Second, rotation handling relied on principal-axis estimation derived from second-order moments:
θ = (½) arctan(2σxy/ (σxx− σyy)).
This approach is effective for moderate rotations and non-degenerate covariance structures. However, highly symmetric patterns may produce orientation ambiguity, necessitating secondary disambiguation mechanisms.
Third, robustness was evaluated under bounded perturbations and limited resampling noise. Large geometric distortions, severe occlusion, or non-binary intensity variations were not considered in the present study.
Fourth, the framework assumes a finite, explicitly stored object library. The method does not address continuous class generalization or large-scale statistical learning scenarios where model abstraction replaces explicit storage.
Future work may explore:
1) Extension to multi-level (non-binary) intensity patterns
2) Integration of deterministic partition hierarchies for larger libraries
3) Spectral gap analysis under structured noise models
4) Hybrid architectures combining invariant filtering with learned embeddings
5) Formal perturbation bounds expressed in terms of Δ and ‖δb‖
A theoretical investigation of spectral separation conditions ensuring uniqueness,
Δ > 0,(82)
in higher-dimensional structured spaces would further strengthen the deterministic foundation of the method.
Abbreviations

Cₖ: Candidate Set at Stage k
CNN: Convolutional Neural Network
DB: Database
k̂: Estimated Class Index
k*: True Class Index
ℓ₁: ℓ₁ Norm (Mass / Zeroth-Moment Content)
ℓ₂: Euclidean Norm
PCA: Principal Component Analysis
sₖ: Cosine Similarity Score
UID: Unique Identifier
Δ: Spectral Compatibility Gap
Σ: Covariance Matrix

Author Contributions
Huseyin Murat Cekirge: Conceptualization, Formal Analysis, Investigation, Methodology, Software, Validation, Visualization, Writing – original draft, Writing – review & editing
Conflicts of Interest
The author declares no conflicts of interest.
References
[1] A. K. Jain, R. P. W. Duin, and J. Mao, “Statistical pattern recognition: A review,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 1, pp. 4–37, 2000.
[2] M. K. Hu, “Visual pattern recognition by moment invariants,” IRE Transactions on Information Theory, vol. 8, no. 2, pp. 179–187, 1962.
[3] R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis. New York, NY, USA: Wiley, 1973.
[4] R. C. Gonzalez and R. E. Woods, Digital Image Processing, 4th ed. Upper Saddle River, NJ, USA: Pearson, 2018.
[5] W. K. Pratt, Digital Image Processing: PIKS Scientific Inside, 4th ed. Hoboken, NJ, USA: Wiley, 2007.
[6] M. Sonka, V. Hlavac, and R. Boyle, Image Processing, Analysis, and Machine Vision, 4th ed. Stamford, CT, USA: Cengage Learning, 2014.
[7] B. K. P. Horn, Robot Vision. Cambridge, MA, USA: MIT Press, 1986.
[8] J. P. Lewis, “Fast template matching,” in Vision Interface 1995, pp. 120–123, 1995.
[9] C. M. Bishop, Pattern Recognition and Machine Learning. New York, NY, USA: Springer, 2006.
[10] R. Szeliski, Computer Vision: Algorithms and Applications. London, U. K.: Springer, 2010.
[11] M. R. Teague, “Image analysis via the general theory of moments,” Journal of the Optical Society of America, vol. 70, no. 8, pp. 920–930, 1980.
[12] J. Flusser, T. Suk, and B. Zitová, Moments and Moment Invariants in Pattern Recognition. Hoboken, NJ, USA: Wiley, 2009.
[13] G. Strang, Introduction to Linear Algebra, 5th ed. Wellesley, MA, USA: Wellesley-Cambridge Press, 2016.
[14] G. H. Golub and C. F. Van Loan, Matrix Computations, 4th ed. Baltimore, MD, USA: Johns Hopkins University Press, 2013.
[15] N. J. Higham, Accuracy and Stability of Numerical Algorithms, 2nd ed. Philadelphia, PA, USA: SIAM, 2002.
[16] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. Hoboken, NJ, USA: Wiley, 2006.
[17] H. M. Cekirge, "Equilibrium-based Deterministic Learning in AI via σ-Regularization," American Journal of Artificial Intelligence, vol. 10, no. 1, pp. 48–60, 2026.
Cite This Article
  • APA Style

    Cekirge, H. M. (2026). An Efficient and Deterministic Object Recognition System. American Journal of Artificial Intelligence, 10(1), 148-157. https://doi.org/10.11648/j.ajai.20261001.23

