---
title: A Dynamic Survey of Soft Set Theory and Its Extensions
tags: 
author: [Testing](https://docswell.com/user/3200568)
site: [Docswell](https://www.docswell.com/)
thumbnail: https://bcdn.docswell.com/page/9J2941YWER.jpg?width=480
description: A Dynamic Survey of Soft Set Theory and Its Extensions
published: April 09, 2026
canonical: https://docswell.com/s/3200568/57NR9D-2026-04-09-005259
---
# Page. 1

![Page Image](https://bcdn.docswell.com/page/9J2941YWER.jpg)

TAKAAKI FUJITA
FLORENTIN SMARANDACHE
A Dynamic Survey of Soft Set Theory
and Its Extensions
NSIA
NEUTROSOPHIC SCIENCE
INTERNATIONAL ASSOCIATION
PUBLISHING HOUSE


# Page. 2

![Page Image](https://bcdn.docswell.com/page/DEY4MZG9JM.jpg)

Takaaki Fujita, Florentin Smarandache
A Dynamic Survey of Soft Set Theory
and Its Extensions
Neutrosophic Science International Association (NSIA)
Publishing House
Gallup - Guayaquil
United States of America – Ecuador
2026


# Page. 3

![Page Image](https://bcdn.docswell.com/page/VJNYW3GD78.jpg)

Editor:
Neutrosophic Science International Association (NSIA)
Publishing House
https://fs.unm.edu/NSIA/
Division of Mathematics and Sciences
University of New Mexico
705 Gurley Ave., Gallup Campus
NM 87301, United States of America
University of Guayaquil
Av. Kennedy and Av. Delta
“Dr. Salvador Allende” University Campus
Guayaquil 090514, Ecuador


# Page. 4

![Page Image](https://bcdn.docswell.com/page/YE9PX9W8J3.jpg)

Table of Contents
1 Introduction . . . . . . . . . . . . . . . . . . . . 5
1.1 Soft Set Theory . . . . . . . . . . . . . . . . . . 5
1.2 Our Contributions . . . . . . . . . . . . . . . . . 5
2 Types of Soft Set . . . . . . . . . . . . . . . . . . 7
2.1 Soft Set . . . . . . . . . . . . . . . . . . . . . . 7
2.2 HyperSoft Set . . . . . . . . . . . . . . . . . . . 7
2.3 SuperHyperSoft Set . . . . . . . . . . . . . . . . . 8
2.4 (m, n)-SuperHyperSoft Set . . . . . . . . . . . . . 10
2.5 TreeSoft Set . . . . . . . . . . . . . . . . . . . . 12
2.6 ForestSoft Set . . . . . . . . . . . . . . . . . . . 14
2.7 IndetermSoft Set . . . . . . . . . . . . . . . . . . 16
2.8 ContraSoft Set . . . . . . . . . . . . . . . . . . . 17
2.9 HesiSoft Set . . . . . . . . . . . . . . . . . . . . 19
2.10 MultiPolar Soft Set . . . . . . . . . . . . . . . . 20
2.11 Dynamic Soft Set . . . . . . . . . . . . . . . . . 22
2.12 Type-n Soft Set . . . . . . . . . . . . . . . . . . 23
2.13 L-Soft Set . . . . . . . . . . . . . . . . . . . . 25
2.14 PosetSoft set (monotone soft set) . . . . . . . . . 26
2.15 Random soft set . . . . . . . . . . . . . . . . . . 28
2.16 Capacitary soft set . . . . . . . . . . . . . . . . 30
2.17 CoverSoft set . . . . . . . . . . . . . . . . . . . 32
2.18 FiltrationSoft set . . . . . . . . . . . . . . . . 33
2.19 T-valued soft set . . . . . . . . . . . . . . . . . 36
2.20 Cubic Soft Set . . . . . . . . . . . . . . . . . . 39
2.21 Probabilistic Soft Set . . . . . . . . . . . . . . 40
2.22 D-soft set . . . . . . . . . . . . . . . . . . . . 41
2.23 Complex Soft Sets . . . . . . . . . . . . . . . . . 43
2.24 Real Soft Set . . . . . . . . . . . . . . . . . . . 45
2.25 Intersectional soft sets . . . . . . . . . . . . . 46
2.26 N-soft Sets . . . . . . . . . . . . . . . . . . . . 47
2.27 n-ary soft set . . . . . . . . . . . . . . . . . . 48
2.28 Linguistic Soft Set . . . . . . . . . . . . . . . . 49
2.29 MetaSoft Set . . . . . . . . . . . . . . . . . . . 50


# Page. 5

![Page Image](https://bcdn.docswell.com/page/GE8D29NZED.jpg)

2.30 Double-framed Soft Set . . . . . . . . . . . . . . 52
2.31 Bijective Soft Set . . . . . . . . . . . . . . . . 54
2.32 Ranked Soft Set . . . . . . . . . . . . . . . . . . 55
2.33 Refined Soft Set . . . . . . . . . . . . . . . . . 56
2.34 MultiSoft Set . . . . . . . . . . . . . . . . . . . 57
2.35 GraphicSoft Set . . . . . . . . . . . . . . . . . . 58
2.36 CycleSoft Set . . . . . . . . . . . . . . . . . . . 61
2.37 ClusterSoft Set . . . . . . . . . . . . . . . . . . 62
2.38 Soft Expert Set . . . . . . . . . . . . . . . . . . 64
2.39 Soft Rough Set . . . . . . . . . . . . . . . . . . 65
2.40 Weighted Soft Set . . . . . . . . . . . . . . . . . 66
2.41 Other Soft Set . . . . . . . . . . . . . . . . . . 67
3 Uncertain Soft Theory . . . . . . . . . . . . . . . . 69
3.1 Fuzzy Soft Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
3.2 Intuitionistic Fuzzy Soft Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
3.3 Neutrosophic Soft Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
3.4 Plithogenic Soft Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
3.5 Uncertain Soft Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
3.6 Z-Soft Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
3.7 Functorial Soft Set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
4 Applications of Soft Set . . . . . . . . . . . . . . 79
4.1 Soft Graph . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
4.2 Soft Topological Space . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
4.3 Soft Algebra . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
4.4 Soft Lattice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
4.5 Soft Vector . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
4.6 Soft functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
4.7 Soft groups . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
4.8 Soft Field . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
4.9 Soft Ring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
4.10 Soft Matroid . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
4.11 Soft Bitopological Space . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
4.12 Soft Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
4.13 Soft Metric Space . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
4.14 Soft probabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
4.15 Soft SemiGroup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
4.16 Soft HyperStructure and SuperHyperStructure . . . . . . . . . . . . . . . . . . . 98
4.17 Soft Graph Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
4.18 HyperSoft Graph Neural Network . . . . . . . . . . . . . . . . . . . . . . . . . . 101
4.19 Soft Natural Languages . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
4.20 Soft n-SuperHyperGraphs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
4.21 Recursive Soft SuperHyperGraph . . . . . . . . . . . . . . . . . . . . . . . . . . 105
4.22 Hierarchical Soft SuperHyperGraph . . . . . . . . . . . . . . . . . . . . . . . . . 107
5 Soft Decision-Making . . . . . . . . . . . . . . . . 111
5.1 Soft decision-making . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.2 HyperSoft TOPSIS and SuperHyperSoft TOPSIS . . . . . . . . . . . . . . . . . . 113
5.3 Soft, HyperSoft, and SuperHyperSoft AHP . . . . . . . . . . . . . . . . . . . . . 116
5.4 Soft, HyperSoft, and SuperHyperSoft VIKOR . . . . . . . . . . . . . . . . . . . . 119
6 Conclusion . . . . . . . . . . . . . . . . . . . . . 123
Appendix (List of Tables) . . . . . . . . . . . . . . . 127


# Page. 6

![Page Image](https://bcdn.docswell.com/page/LELM2WP17R.jpg)

Chapter 1
Introduction
1.1 Soft Set Theory
Classical (crisp) set theory provides a precise and widely used language for formal reasoning
and mathematical modeling [3]. Over the past decades, many generalized set frameworks have
been introduced to represent uncertainty and vagueness, including fuzzy sets [4], intuitionistic
fuzzy sets [5], hesitant fuzzy sets [6], picture fuzzy sets [7], single-valued neutrosophic sets [8, 9],
quadripartitioned neutrosophic sets [10], pentapartitioned neutrosophic sets [11], double-valued
neutrosophic sets [12], hesitant neutrosophic sets [13], plithogenic sets [14,15], and soft sets [2,16].
A fuzzy set assigns to each element x a single membership grade µ(x) ∈ [0, 1], thereby capturing
gradual inclusion rather than a sharp yes/no decision [4,17]. Neutrosophic sets extend this viewpoint by associating three (generally independent) degrees T (x), I(x), F (x) ∈ [0, 1], interpreted
as truth, indeterminacy, and falsity, respectively [8,18]. Because these models encode uncertainty
more flexibly than crisp sets, they have been applied widely, for example in decision-making [19],
robotics and system integration [20], artificial intelligence [21], and neural networks [22, 23].
A soft set offers a direct framework for parameterized decision modeling by associating each
attribute (or parameter) with a subset of a universe, thereby handling uncertainty in a structured
manner [1, 2]. Like fuzzy and neutrosophic frameworks, soft set theory has developed many
extensions and variants, and its applications have been studied across a wide range of areas,
including decision support and related fields.
1.2 Our Contributions
In light of these developments, research on soft set theory remains important. Moreover, because
a large number of papers on soft sets and their extensions continue to appear, survey-style works
play an increasingly valuable role in organizing and clarifying the landscape. Motivated by
this need, in this book we provide a survey-style overview of soft set theory and its major
developments.


# Page. 7

![Page Image](https://bcdn.docswell.com/page/4JMY8945JW.jpg)

A Dynamic Survey of Soft Set Theory and Its Extensions
Takaaki Fujita¹ ∗ and Florentin Smarandache²
¹ Independent Researcher, Tokyo, Japan. Email: Takaaki.fujita060@gmail.com
² University of New Mexico, Gallup Campus, NM 87301, USA. Email: fsmarandache@gmail.com
Abstract
Soft set theory provides a direct framework for parameterized decision modeling by assigning to
each attribute (parameter) a subset of a given universe, thereby representing uncertainty in a
structured way [1, 2]. Over the past decades, the theory has expanded into numerous variants—
including hypersoft sets, superhypersoft sets, TreeSoft sets, bipolar soft sets, and dynamic soft
sets—and has been connected to diverse areas such as topology and matroid theory. In this
book, we present a survey-style overview of soft sets and their major extensions, highlighting
core definitions, representative constructions, and key directions of current development.
Keywords: Soft Set, HyperSoft Set, SuperHyperSoft Set, Soft Theory


# Page. 8

![Page Image](https://bcdn.docswell.com/page/PJR95GWZ79.jpg)

Chapter 2
Types of Soft Set
A wide variety of extended soft-set models have been proposed. In this chapter, we provide a survey-style introduction to these extensions, with a brief discussion of each.
2.1 Soft Set
A Soft Set offers a straightforward approach to parameterized decision modeling by associating
attributes (or parameters) with subsets of a universal set, effectively addressing uncertainty in
a structured manner [1, 2].
Definition 2.1.1 (Soft Set). [1, 2] Let U be a universal set and A be a set of attributes. A soft
set over U is a pair (F, S), where S ⊆ A and F : S → P(U ). Here, P(U ) denotes the power
set of U . Mathematically, a soft set is represented as:
(F, S) = {(α, F(α)) | α ∈ S, F(α) ∈ P(U )}.
Each α ∈ S is called a parameter, and F(α) is the set of elements in U associated with α.
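The finite case of Definition 2.1.1 can be sketched directly in code; the universe, parameter names, and mapping below are illustrative choices, not taken from the source.

```python
# A finite soft set (F, S) as a dictionary: each parameter in S maps to a
# subset of the universe U (an element of P(U)).
U = {"h1", "h2", "h3", "h4"}          # hypothetical universe of houses

F = {                                  # F : S -> P(U)
    "cheap":     {"h1", "h3"},
    "wooden":    {"h2", "h3", "h4"},
    "renovated": {"h1", "h2"},
}

def image(param):
    """Return F(param), the subset of U associated with parameter param."""
    return F[param]

# Parameters act as independent criteria, so their images combine set-wise:
cheap_and_wooden = image("cheap") & image("wooden")
print(cheap_and_wooden)  # {'h3'}
```

Since each F(α) is an ordinary subset of U, all classical set operations remain available on the images.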
2.2 HyperSoft Set
A HyperSoft set maps each multi-attribute value tuple to a subset of the universe, capturing
combined parameter interactions [24–26].
Definition 2.2.1 (Hypersoft Set). [26] Let U be a universal set, and let A1 , A2 , . . . , Am be
attribute domains. Define C = A1 × A2 × · · · × Am , the Cartesian product of these domains. A
hypersoft set over U is a pair (G, C), where G : C → P(U ). The hypersoft set is expressed as:
(G, C) = {(γ, G(γ)) | γ ∈ C, G(γ) ∈ P(U )}.
For an m-tuple γ = (γ1 , γ2 , . . . , γm ) ∈ C , where γi ∈ Ai for i = 1, 2, . . . , m, G(γ) represents the
subset of U corresponding to the combination of attribute values γ1 , γ2 , . . . , γm .


# Page. 9

![Page Image](https://bcdn.docswell.com/page/PEXQKXP1JX.jpg)

Chapter 2. Types of Soft Set
Example 2.2.2 (Example of a HyperSoft Set: laptop recommendation by exact attribute tuples). Let U be a finite set of laptop models:
U = {ℓ1, ℓ2, ℓ3, ℓ4, ℓ5},
where ℓ1 = Model A, ℓ2 = Model B, ℓ3 = Model C, ℓ4 = Model D, ℓ5 = Model E.
Consider m = 3 attribute domains:
A1 = {Low, Mid, High} (price tier),
A2 = {Light, Standard} (weight class),
A3 = {Long, Normal} (battery life).
Set the hypersoft parameter domain
C = A1 × A 2 × A 3 .
Define a mapping G : C → P(U ) by assigning, to each attribute tuple γ = (γ1 , γ2 , γ3 ) ∈ C , the
subset G(γ) ⊆ U of laptops matching that exact combination. For instance, suppose the models
have the following tags:
| model | price | weight | battery |
|-------|-------|----------|---------|
| ℓ1 | Low | Standard | Normal |
| ℓ2 | Mid | Light | Long |
| ℓ3 | Mid | Standard | Long |
| ℓ4 | High | Light | Long |
| ℓ5 | High | Standard | Normal |
As a concrete evaluation, take the tuple
γ ∗ = (High, Light, Long) ∈ C.
Then
G(γ∗) = {ℓ4}.
Similarly,
G(Mid, Standard, Long) = {ℓ3},
G(Mid, Light, Long) = {ℓ2}.
Thus (G, C) is a HyperSoft Set over U : each parameter is a tuple of attribute values, and G(γ)
returns the subset of objects in U that satisfy exactly that combined tuple.
2.3 SuperHyperSoft Set
A SuperHyperSoft set maps tuples of subsets of attribute-value sets to universe subsets, modeling
set-valued multi-attribute constraints [27–30].


# Page. 10

![Page Image](https://bcdn.docswell.com/page/3EK95WDMED.jpg)

Definition 2.3.1 (SuperHyperSoft Set). [30] Let U be a universal set, and let P(U ) denote the
power set of U . Consider n distinct attributes a1 , a2 , . . . , an , where n ≥ 1. Each attribute ai is
associated with a set of attribute values Ai, satisfying Ai ∩ Aj = ∅ for all i ≠ j.
Define P(Ai ) as the power set of Ai for each i = 1, 2, . . . , n. Then, the Cartesian product of the
power sets of attribute values is given by:
C = P(A1 ) × P(A2 ) × · · · × P(An ).
A SuperHyperSoft Set over U is a pair (F, C), where:
F : C → P(U ),
and F maps each element (α1 , α2 , . . . , αn ) ∈ C (with αi ∈ P(Ai )) to a subset F (α1 , α2 , . . . , αn ) ⊆
U . Mathematically, the SuperHyperSoft Set is represented as:
(F, C) = {(γ, F (γ)) | γ ∈ C, F (γ) ∈ P(U )}.
Here, γ = (α1 , α2 , . . . , αn ) ∈ C , where αi ∈ P(Ai ) for i = 1, 2, . . . , n, and F (γ) corresponds to
the subset of U defined by the combined attribute values α1 , α2 , . . . , αn .
Example 2.3.2 (Example of a SuperHyperSoft Set: meal planning with set-valued attribute
choices). Let U be a set of dinner recipes:
U = {r1 , r2 , r3 , r4 , r5 , r6 },
where r1 = tofu salad, r2 = chicken stir-fry, r3 = lentil soup, r4 = salmon bowl, r5 = gluten-free
pasta, r6 = vegetable curry.
Consider n = 3 distinct attributes:
a1 = Diet type,
a2 = Main protein,
a3 = Cooking time.
Let the corresponding attribute-value sets be
A1 = {Vegan, Omnivore, Pescatarian},
A2 = {Tofu, Chicken, Fish, Legumes},
A3 = {Quick, Medium, Long},
so Ai ∩ Aj = ∅ for i ≠ j. Define the super-parameter domain
C = P(A1 ) × P(A2 ) × P(A3 ).
For each γ = (α1 , α2 , α3 ) ∈ C , interpret αi ⊆ Ai as a set of acceptable values for attribute ai
(rather than a single value). Define F : C → P(U ) by selecting recipes compatible with the
acceptable sets. For instance, suppose the recipe tags are:
| recipe | diet | protein | time |
|--------|------|---------|------|
| r1 | Vegan | Tofu | Quick |
| r2 | Omnivore | Chicken | Medium |
| r3 | Vegan | Legumes | Long |
| r4 | Pescatarian | Fish | Quick |
| r5 | Omnivore | Legumes | Medium |
| r6 | Vegan | Legumes | Medium |


# Page. 11

![Page Image](https://bcdn.docswell.com/page/L73WK1R275.jpg)

As a concrete evaluation, take
γ ∗ = (α1 , α2 , α3 ) = ({Vegan, Pescatarian}, {Tofu, Fish}, {Quick}).
Then F (γ ∗ ) is the subset of recipes whose diet is in α1 , protein is in α2 , and cooking time is in
α3 :
F (γ ∗ ) = {r1 , r4 }.
Thus (F, C) is a SuperHyperSoft Set: each parameter is a tuple of subsets (α1 , α2 , α3 ) specifying
acceptable attribute values at each coordinate, and F (α1 , α2 , α3 ) returns the recipes satisfying
the combined set-valued constraints.
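The meal-planning evaluation can be sketched the same way; here each parameter coordinate is a set of acceptable values rather than a single value (the tag table follows the example).

```python
# Tags from the example: recipe -> (diet, protein, time).
tags = {
    "r1": ("Vegan",       "Tofu",    "Quick"),
    "r2": ("Omnivore",    "Chicken", "Medium"),
    "r3": ("Vegan",       "Legumes", "Long"),
    "r4": ("Pescatarian", "Fish",    "Quick"),
    "r5": ("Omnivore",    "Legumes", "Medium"),
    "r6": ("Vegan",       "Legumes", "Medium"),
}

def F(alpha1, alpha2, alpha3):
    """SuperHyperSoft mapping: each alpha_i is a SET of acceptable values."""
    return {r for r, (diet, protein, time) in tags.items()
            if diet in alpha1 and protein in alpha2 and time in alpha3}

result = F({"Vegan", "Pescatarian"}, {"Tofu", "Fish"}, {"Quick"})
print(sorted(result))  # ['r1', 'r4']
```

Note that widening any coordinate αi can only enlarge F(α1, α2, α3), since membership is tested coordinate-wise.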
A comparison of soft sets, hypersoft sets, and superhypersoft sets is presented in Table 2.1.
Table 2.1: Concise comparison of Soft sets, HyperSoft sets, and SuperHyperSoft sets.
| Aspect | Soft set | HyperSoft set (Hypersoft set) | SuperHyperSoft set |
|--------|----------|-------------------------------|--------------------|
| Parameter domain | A subset A ⊆ E of parameters. | A Cartesian product C = A1 × · · · × Am of attribute-value domains. | A product of power sets C = P(A1) × · · · × P(An) (subset-valued attribute choices). |
| Evaluation map (codomain) | F : A → P(U). | G : C → P(U). | F : C → P(U) with C as above. |
| Meaning of one parameter | A single attribute/criterion e ∈ A selects a subset F(e) ⊆ U. | A full attribute-value tuple γ ∈ C selects G(γ) ⊆ U. | A tuple of value-subsets (α1, . . . , αn) ∈ C selects F(α1, . . . , αn) ⊆ U. |
| Typical use | Parameterized selection under independent criteria. | Multi-attribute selection under simultaneous value assignments. | Multi-attribute selection under set-valued (possibly multi-choice) constraints per attribute. |
2.4 (m, n)-SuperHyperSoft Set
An (m, n)-SuperHyperSoft set parameterizes objects by m attribute groups across n hierarchical
subset-levels, mapping each tuple to a universe subset. Let U be a nonempty universe, and let
A1 , A2 , . . . , Am
be m pairwise‑disjoint attribute domains. We write P(Ai ) for the power set of Ai . Introduce
C = P(A1) × P(A2) × · · · × P(Am),
whose elements are tuples α = (α1 , . . . , αm ) with αi ⊆ Ai .


# Page. 12

![Page Image](https://bcdn.docswell.com/page/87DK3XY6JG.jpg)

Similarly, fix a single universal codomain U and consider
D = P(U) × · · · × P(U) (n factors).
An element of D is an n-tuple X = (X1 , . . . , Xn ) with each Xj ⊆ U .
Definition 2.4.1 ((m, n)‑SuperHyperSoft Set). An (m, n)-SuperHyperSoft Set on U (with attribute domains A1 , . . . , Am ) is a function
F : C −→ D.
Equivalently, one may write

F(α1, . . . , αm) = (F1(α1, . . . , αm), . . . , Fn(α1, . . . , αm)),
where each coordinate
Fj : C −→ P(U ) (j = 1, . . . , n)
is itself a “classical” m-SuperHyperSoft Set¹.
Remark 2.4.2. Thus an (m, n)-SuperHyperSoft Set encodes n different m‑SuperHyperSoft
evaluations in parallel, one per coordinate.
Example 2.4.3 (Example of an (m, n)-SuperHyperSoft Set: course recommendation with twolevel outputs). Let U be a set of university courses:
U = {c1 , c2 , c3 , c4 , c5 , c6 },
where c1 = Linear Algebra, c2 = Discrete Mathematics, c3 = Machine Learning, c4 = Databases,
c5 = Algorithms, c6 = Statistics.
Take m = 3 pairwise-disjoint attribute domains describing a student’s preferences:
A1 = {Math, CS, Data} (interest area),
A2 = {Beginner, Intermediate, Advanced} (difficulty tolerance),
A3 = {Short, Normal} (weekly workload).
Define the input domain
C = P(A1 ) × P(A2 ) × P(A3 ).
Fix n = 2 output levels and define
D = P(U ) × P(U ).
Interpret the first component as recommended courses and the second as optional courses.
Define F : C → D by

F(α1, α2, α3) = (F1(α1, α2, α3), F2(α1, α2, α3)),
¹ I.e., a mapping from C into P(U).


# Page. 13

![Page Image](https://bcdn.docswell.com/page/VJPK4P2ZE8.jpg)

where F1 , F2 : C → P(U ) are given by a simple rule-based advisor.
For a concrete parameter tuple, take
(α1 , α2 , α3 ) = ({CS, Data}, {Intermediate, Advanced}, {Normal}).
Suppose the advisor outputs
F1 (α1 , α2 , α3 ) = {c3 , c5 , c4 } and F2 (α1 , α2 , α3 ) = {c6 , c2 }.
Thus

F(α1, α2, α3) = ({c3, c5, c4}, {c6, c2}) ∈ D.
Interpretation: the input (α1 , α2 , α3 ) specifies sets of acceptable values for each attribute group
(interest area, difficulty, workload), and the output is an n-tuple of subsets of U (recommended
and optional course lists). Hence F is an (m, n)-SuperHyperSoft Set on U with (m, n) = (3, 2).
For reference, the comparison between a SuperHyperSoft set and an (m, n)-SuperHyperSoft set
is summarized in Table 2.2.
Table 2.2: Concise comparison between a SuperHyperSoft set and an (m, n)-SuperHyperSoft set
on a universe U .
| Aspect | SuperHyperSoft set (single-output) | (m, n)-SuperHyperSoft set (multi-output) |
|--------|------------------------------------|------------------------------------------|
| Attribute domains | Pairwise-disjoint domains A1, . . . , Am; each input component is a subset αi ⊆ Ai. | Same domains A1, . . . , Am and the same subset-valued input components αi ⊆ Ai. |
| Input (parameter) space | C = P(A1) × · · · × P(Am). | Same C = P(A1) × · · · × P(Am). |
| Mapping (codomain) | F : C → P(U). | F : C → D, where D = P(U) × · · · × P(U) (n factors). |
| Output semantics | For each α ∈ C, a single selected subset F(α) ⊆ U. | For each α ∈ C, an n-tuple F(α) = (F1(α), . . . , Fn(α)) with Fj(α) ⊆ U (e.g., recommended/optional/rejected tiers). |
| Equivalent viewpoint | One m-attribute, subset-valued selector on U. | An ordered family of n parallel selectors (F1, . . . , Fn), each an m-SuperHyperSoft-type map C → P(U). |
| Reduction / relation | Special case of (m, n) with n = 1 (identify D = P(U)). | Strict extension of the single-output model by allowing n coordinated outputs for each input tuple. |
| Typical use | Set-valued multi-attribute constraints/selection with set-valued attribute inputs. | Multi-stage screening, multi-tier reporting, or hierarchical decision outputs under the same set-valued multi-attribute inputs. |

2.5 TreeSoft Set
A TreeSoft set maps subsets of a hierarchical attribute tree to universe subsets, modeling refined
parameters across multiple levels [31–34]. Related concepts include PolyTree-soft sets [35] and
Tree-to-Tree soft sets [36].


# Page. 14

![Page Image](https://bcdn.docswell.com/page/2EVVX2DMEQ.jpg)

Definition 2.5.1 (TreeSoft Set). [37] Let U be a universe of discourse and let H be a nonempty
subset of U . Write P(H) for the power set of H . Let A = {A1 , A2 , . . . , An } be a set of attributes
(parameters, factors, etc.), where n ≥ 1 and each Ai is regarded as a first-level attribute.
Each first-level attribute Ai may be refined into a set of second-level sub-attributes
Ai = {Ai,1 , Ai,2 , . . . }.
Likewise, each second-level sub-attribute Ai,j may be further refined into third-level sub-subattributes,
Ai,j = {Ai,j,1 , Ai,j,2 , . . . },
and so on. In general, one may consider sub-attributes at the m-th level, indexed by Ai1 ,i2 ,...,im ,
where each index ik specifies the position at level k .
This hierarchical attribute organization determines a rooted tree, denoted by Tree(A), whose
root is A (level 0) and whose nodes consist of all attributes and sub-attributes across levels 1
through m. The terminal nodes (nodes without descendants) are called the leaves of Tree(A).
A TreeSoft Set on H (with attribute-tree Tree(A)) is a mapping

F : P(Tree(A)) −→ P(H),
where P(Tree(A)) denotes the power set of the node set of Tree(A).
Example 2.5.2 (Example of a TreeSoft Set: medical triage rules organized by a symptom tree).
Let U be a universe of patients and let
H = {p1 , p2 , p3 , p4 , p5 , p6 } ⊆ U
be a finite set of patients currently in a clinic.
Consider a hierarchical attribute system with two first-level attributes:
A = {A1 , A2 },
A1 = “Respiratory”,
A2 = “Cardiovascular”.
Refine each first-level attribute into second-level sub-attributes:
A1 = {A1,1 , A1,2 },
A1,1 = “Cough”,
A1,2 = “Shortness of breath”,
A2 = {A2,1 , A2,2 },
A2,1 = “Chest pain”,
A2,2 = “Palpitations”.
Refine one second-level attribute further into third-level sub-sub-attributes:
A1,2 = {A1,2,1 , A1,2,2 },
A1,2,1 = “Mild dyspnea”,
A1,2,2 = “Severe dyspnea”.
Let Tree(A) denote the rooted attribute tree whose nodes are
Tree(A) = {A, A1 , A2 , A1,1 , A1,2 , A2,1 , A2,2 , A1,2,1 , A1,2,2 }.
Define a TreeSoft Set
F : P(Tree(A)) −→ P(H)


# Page. 15

![Page Image](https://bcdn.docswell.com/page/57GLVRYXEL.jpg)

by mapping any chosen set of nodes X ⊆ Tree(A) to the subset F (X) ⊆ H of patients who
satisfy all clinical features represented in X . Concretely, suppose the clinic records yield:
F ({A1,1 }) = {p1 , p2 , p5 } (patients with cough),
F ({A1,2,2 }) = {p2 , p6 } (patients with severe dyspnea),
F ({A2,1 }) = {p3 , p6 } (patients with chest pain).
For a combined node-set, define F by intersection of the corresponding patient groups; for
example,
F ({A1,1 , A1,2,2 }) = F ({A1,1 }) ∩ F ({A1,2,2 }) = {p2 },
and
F ({A2,1 , A1,2,2 }) = F ({A2,1 }) ∩ F ({A1,2,2 }) = {p6 }.
Thus F assigns to each subset of attribute-tree nodes a subset of patients in H matching the
selected hierarchical symptom description, and therefore (F, Tree(A)) constitutes a TreeSoft Set
on H .
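The triage rules above can be checked mechanically. The intersection convention for combined node-sets follows the example, and only nodes with recorded data are stored; this is a sketch, not a general TreeSoft implementation.

```python
H = {"p1", "p2", "p3", "p4", "p5", "p6"}

# Recorded patient groups at selected tree nodes (from the example).
groups = {
    "A1,1":   {"p1", "p2", "p5"},   # Cough
    "A1,2,2": {"p2", "p6"},         # Severe dyspnea
    "A2,1":   {"p3", "p6"},         # Chest pain
}

def F(nodes):
    """TreeSoft mapping: patients satisfying ALL features in the node set."""
    result = set(H)
    for node in nodes:
        result &= groups[node]
    return result

print(F({"A1,1", "A1,2,2"}))   # {'p2'}
print(F({"A2,1", "A1,2,2"}))   # {'p6'}
```

Under this convention F(∅) = H; a different aggregation rule (e.g., union) would define a different TreeSoft set on the same tree.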
2.6 ForestSoft Set
A ForestSoft Set is formed by taking a collection of TreeSoft Sets and “gluing” (uniting) them
together so as to obtain a single function whose domain is the union of all tree-nodes’ power sets
and whose values in P (H) combine the images given by the individual TreeSoft Sets [38–41].
Definition 2.6.1 (ForestSoft Set). [40] Let U be a universe of discourse, H ⊆ U be a non-empty
subset, and P (H) be the power set of H . Suppose we have a finite (or countable) collection of
TreeSoft Sets

{Ft : P(Tree(A(t))) → P(H)}t∈T,
where each Ft is a TreeSoft Set corresponding to a tree Tree(A(t) ) of attributes A(t) .
We construct a forest by taking the (disjoint) union of all these trees:
Forest({A(t)}t∈T) = ⊔t∈T Tree(A(t)).
A ForestSoft Set, denoted by

F : P(Forest({A(t)})) −→ P(H),
is defined as the union of all TreeSoft Set mappings Ft. Concretely, for any element X ∈ P(Forest({A(t)})), we set
F(X) = ⋃ { Ft(X ∩ Tree(A(t))) : t ∈ T, X ∩ Tree(A(t)) ≠ ∅ },
where we only apply Ft to that portion of X belonging to the tree Tree(A(t) ).


# Page. 16

![Page Image](https://bcdn.docswell.com/page/4EQY6VM5JP.jpg)

Example 2.6.2 (Example of a ForestSoft Set: hospital triage across multiple specialty trees).
Let U be a universe of patients and let
H = {p1 , p2 , p3 , p4 , p5 , p6 , p7 , p8 } ⊆ U
be the set of patients currently under assessment.
Assume two medical specialties provide separate hierarchical attribute trees (a forest):
T = {tResp , tCard }.
(1) Respiratory TreeSoft Set. Let Tree(A(tResp ) ) be the respiratory attribute tree with nodes
Tree(A(tResp ) ) = {R, RCough , RDyspnea , RSevDyspnea },
interpreted as R = Respiratory (root), RCough = Cough, RDyspnea = Dyspnea, RSevDyspnea =
Severe dyspnea. Define a TreeSoft Set
FtResp : P(Tree(A(tResp ) )) → P(H)
by patient groups:
FtResp ({RCough }) = {p1 , p2 , p5 },
FtResp ({RSevDyspnea }) = {p2 , p6 , p8 },
and (as a typical rule) for combined node-sets use intersections, e.g.,
FtResp ({RCough , RSevDyspnea }) = {p1 , p2 , p5 } ∩ {p2 , p6 , p8 } = {p2 }.
(2) Cardiovascular TreeSoft Set. Let Tree(A(tCard ) ) be the cardiovascular attribute tree with
nodes
Tree(A(tCard ) ) = {C, CChestPain , CArrhythmia },
interpreted as C = Cardiovascular (root), CChestPain = Chest pain, CArrhythmia = Arrhythmia/palpitations. Define a TreeSoft Set
FtCard : P(Tree(A(tCard ) )) → P(H)
by
FtCard ({CChestPain }) = {p3 , p6 },
and for a combined node-set,
FtCard ({CArrhythmia }) = {p4 , p7 },
FtCard ({CChestPain , CArrhythmia }) = {p3 , p6 } ∩ {p4 , p7 } = ∅.
(3) Forest and ForestSoft Set aggregation. Form the forest by disjoint union:
Forest = Tree(A(tResp)) ⊔ Tree(A(tCard)).
Define the ForestSoft Set F : P(Forest) → P(H) by
F(X) = ⋃ { Ft(X ∩ Tree(A(t))) : t ∈ T, X ∩ Tree(A(t)) ≠ ∅ }.
For example, take the mixed selection
X = {RSevDyspnea , CChestPain } ⊆ Forest.
Then
F(X) = FtResp ({RSevDyspnea }) ∪ FtCard ({CChestPain }) = {p2 , p6 , p8 }∪{p3 , p6 } = {p2 , p3 , p6 , p8 }.
Interpretation: the ForestSoft Set aggregates the (possibly different) specialty-specific TreeSoft
Set outputs, enabling a unified view across multiple hierarchical symptom trees.
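The aggregation step can be sketched as follows: evaluate each TreeSoft set only on its own portion of the selection, then take the union. The tree data and the per-tree intersection rule follow the example.

```python
# Each specialty tree: recorded single-node patient groups (keys = node set).
trees = {
    "Resp": {"RCough": {"p1", "p2", "p5"},
             "RSevDyspnea": {"p2", "p6", "p8"}},
    "Card": {"CChestPain": {"p3", "p6"},
             "CArrhythmia": {"p4", "p7"}},
}

def tree_eval(table, nodes):
    """Per-tree TreeSoft rule from the example: intersect the chosen groups."""
    result = None
    for node in nodes:
        result = table[node] if result is None else result & table[node]
    return result or set()

def forest_eval(X):
    """ForestSoft rule: union of Ft(X ∩ Tree(A^(t))) over trees that meet X."""
    out = set()
    for table in trees.values():
        part = X & table.keys()   # the portion of X belonging to this tree
        if part:
            out |= tree_eval(table, part)
    return out

print(sorted(forest_eval({"RSevDyspnea", "CChestPain"})))
# ['p2', 'p3', 'p6', 'p8']
```

A mixed selection thus combines specialty-specific evaluations without ever applying one tree's mapping to another tree's nodes.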


# Page. 17

![Page Image](https://bcdn.docswell.com/page/KJ4W4MVV71.jpg)

2.7 IndetermSoft Set
A single-valued IndetermSoft Set maps each attribute value to a single subset of H, capturing indeterminate (non-unique) information; the domain or codomain may themselves be indeterminate [42–46].
Definition 2.7.1 ((Single-valued) IndetermSoft set). [24, 37, 47, 48] Let U be a universe of
discourse, H ⊆ U a non-empty subset, and P (H) the powerset of H . Let A be the set of
attribute values for an attribute a. A function F : A → P (H) is called an IndetermSoft Set if
at least one of the following conditions holds:
1. A has some indeterminacy.
2. P (H) has some indeterminacy.
3. There exists at least one v ∈ A such that F (v) is indeterminate (unclear, uncertain, or not
unique).
4. Any two or all three of the above conditions.
An IndetermSoft Set is represented mathematically as:
F : A → H(∩, ∪, ⊕, ¬),
where H(∩, ∪, ⊕, ¬) represents a structure closed under the IndetermSoft operators.
Example 2.7.2 (Example of a (single-valued) IndetermSoft set: recruiting with missing/uncertain evidence). Let U be the set of shortlisted applicants for a data-engineering position:
U = {u1 , u2 , u3 , u4 , u5 },
H := U.
Consider one attribute a = “technical screening outcome” with a value-set
A = {Pass, Borderline, Fail}.
Define F : A → P(H) as follows. Suppose the company has completed the screening, but two
applicants (u2 , u5 ) have indeterminate results due to missing logs and a disputed proctoring
report. Thus, the subsets corresponding to clear outcomes are:
F (Pass) = {u1 , u3 },
F (Fail) = {u4 },
while the “Borderline” group is not uniquely determined: depending on which audit is accepted,
either u2 is borderline and u5 is cleared, or vice versa. Hence we treat
F (Borderline) = indeterminate (not unique).
One convenient single-valued representation is to regard F (Borderline) as taking values in a
family of possible subsets (an indeterminate value), e.g.,

F(Borderline) ∈ { {u2}, {u5} }.
Interpretation: the attribute-value set A is crisp and the universe H is crisp, but at least one value F(v) (here v = Borderline) is indeterminate (unclear, uncertain, or not unique). Therefore F satisfies Condition (3) of Definition 2.7.1, and so F constitutes an IndetermSoft set on H.
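One way to make the non-uniqueness of F(Borderline) concrete is to store, for each attribute value, the family of candidate subsets that remain possible; a value is determinate exactly when that family is a singleton. This encoding is illustrative, not part of the definition.

```python
# Each attribute value maps to a FAMILY of candidate subsets of H.
F = {
    "Pass":       [{"u1", "u3"}],       # determinate: one candidate
    "Fail":       [{"u4"}],             # determinate: one candidate
    "Borderline": [{"u2"}, {"u5"}],     # indeterminate: not unique
}

def is_indeterminate(value):
    """Condition (3) of Definition 2.7.1: F(value) is not uniquely determined."""
    return len(F[value]) > 1

print(is_indeterminate("Borderline"), is_indeterminate("Pass"))  # True False
```

Under this encoding, resolving the disputed audit would shrink the Borderline family to one candidate, and the mapping would collapse to an ordinary soft set.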


# Page. 18

![Page Image](https://bcdn.docswell.com/page/LE1Y48347G.jpg)

Related concepts include the following notions.
• IndetermHyperSoft Set [43, 49, 50]: a HyperSoft set whose parameter tuples or images
may be indeterminate, nonunique, or partially specified.
• IndetermSuperHyperSoft Set [51]: a SuperHyperSoft set allowing indeterminacy in higher-order parameter subsets and the corresponding approximations.
• Bipolar IndetermSoft Set [52]: an IndetermSoft set with positive and negative evaluations,
permitting indeterminacy within both perspectives.
• Weighted IndetermSoft Set [44]: an IndetermSoft set that assigns each attribute a weight
and a possibly indeterminate approximation, enabling prioritized decision-making under
uncertainty and with incomplete data.
2.8 ContraSoft Set
A ContraSoft Set is a parameterized soft set where each parameter’s values are associated with
a contradiction degree, and thresholding is used to aggregate only those values that are not too
contradictory with respect to a chosen reference [53]. This allows soft-set modeling to filter or
weight information based on contradiction, rather than uncertainty.
Definition 2.8.1 (Contradiction on attribute values). [53] Let V be a nonempty finite set of
attribute values. A contradiction function on V is a map
c : V × V −→ [0, 1]
such that
c(v, v) = 0 (reflexivity),
c(v, w) = c(w, v) (symmetry).
The quantity c(v, w) measures the degree of contradiction between v and w (larger means more
contradictory).
Definition 2.8.2 (ContraSoft structure). Let U be a nonempty universe and E a nonempty set
of parameters. For each e ∈ E fix:
• a nonempty finite value set Ve ;
• a contradiction function ce : Ve × Ve → [0, 1] (Definition 2.8.1);
• a designated reference value ve⋆ ∈ Ve .
Write V := ⨆e∈E ({e} × Ve ) for the disjoint union of all parameter–value pairs.
Definition 2.8.3 (ContraSoft Set). Let U be a finite universe of objects and E a finite set of
parameters. A ContraSoft Set is a quadruple
CS := (U, E, F, c),
where
• F : E → P(U ) is the (crisp) soft mapping; F (e) ⊆ U is the set of objects accepted (or
classified as positive) under parameter e;
• c : E × E → [0, 1] is a contradiction degree on parameters, symmetric and reflexive on the
diagonal:
c(e, e) = 0,
c(e, f ) = c(f, e) (∀ e, f ∈ E).
For x ∈ U and e ∈ E , the atomic statement “x is accepted by e” is represented by
A(x, e) : x ∈ F (e),
with truth value T if x ∈ F (e) and F otherwise.
Remark 2.8.4 (Relation to classical soft sets and to “indeterminacy”). If Ve = {ve⋆ } for all e,
then F (τ ) (e) = F (e, ve⋆ ) and we recover the classical soft set (F ◦ , E) with F ◦ (e) = F (e, ve⋆ ).
Thus, contradiction plays the role of the third component often used as “indeterminacy” (e.g. in
neutrosophic settings), but here it acts as a distance-to-reference that controls which value-slices
are admitted into F (τ ) (e).
Example 2.8.5 (Real-life example of a ContraSoft Set: hiring filters with contradictory criteria).
Let U be a finite set of job applicants:
U = {u1 , u2 , u3 , u4 , u5 , u6 }.
Let E be a finite set of screening parameters:
E = {eExp , eSalary , eRemote , eOnsite },
where eExp = “has strong experience”, eSalary = “fits low salary budget”, eRemote = “prefers
fully remote”, and eOnsite = “prefers on-site”.
Define the soft mapping F : E → P(U ) by the applicants accepted under each criterion:
F (eExp ) = {u1 , u3 , u5 },
F (eSalary ) = {u2 , u4 , u6 },
F (eRemote ) = {u1 , u2 , u6 },
F (eOnsite ) = {u3 , u4 , u5 }.
Now define a contradiction degree c : E ×E → [0, 1] capturing how incompatible two parameters
are. For example, “remote” and “on-site” are highly contradictory, while “experience” and “salary
budget” are moderately contradictory:
c(eRemote , eOnsite ) = c(eOnsite , eRemote ) = 0.95,
c(eExp , eSalary ) = c(eSalary , eExp ) = 0.60,
and set c(e, e) = 0 for all e ∈ E ; for all other unordered pairs not listed above, take c = 0.20.
Then CS = (U, E, F, c) is a ContraSoft Set. Interpretation: when aggregating decisions across
parameters, one may downweight or discard combinations that simultaneously use highly contradictory
criteria (e.g., combining eRemote with eOnsite ), while allowing combinations with low contradiction.
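The thresholded aggregation described above can be sketched in Python; the tolerance value and the helper names (`c`, `joint_accept`) are illustrative assumptions, not notation from [53]:

```python
# ContraSoft sketch: intersect two acceptance sets only if the criteria
# are not too contradictory with respect to a tolerance tau.
U = {"u1", "u2", "u3", "u4", "u5", "u6"}
F = {
    "eExp":    {"u1", "u3", "u5"},
    "eSalary": {"u2", "u4", "u6"},
    "eRemote": {"u1", "u2", "u6"},
    "eOnsite": {"u3", "u4", "u5"},
}

def c(e, f):
    """Symmetric contradiction degree; unlisted distinct pairs default to 0.20."""
    if e == f:
        return 0.0
    listed = {
        frozenset({"eRemote", "eOnsite"}): 0.95,
        frozenset({"eExp", "eSalary"}): 0.60,
    }
    return listed.get(frozenset({e, f}), 0.20)

def joint_accept(e, f, tau):
    """Combine F(e) and F(f) only when their contradiction is within tau."""
    if c(e, f) > tau:
        return None  # the pair is too contradictory to combine
    return F[e] & F[f]
```

With tau = 0.5, the pair (eRemote, eOnsite) is rejected outright, while (eExp, eOnsite) is admitted and intersected.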
A comparison between Soft Sets and ContraSoft Sets is presented in Table 2.3.
Table 2.3: Soft Set vs. ContraSoft Set (concise comparison)

| Aspect | Soft Set | ContraSoft Set |
| --- | --- | --- |
| Core idea | Parameterized family of subsets of a universe. | Soft Set augmented with contradiction degrees to control acceptance/weighting. |
| Universe/Parameters | Universe U , parameter set E . | Same U, E plus contradiction map(s). |
| Mapping | F : E → P(U ). | F : E → P(U ) together with contradiction c on parameters and/or values. |
| Extra structure | None. | c : E × E → [0, 1] (and optionally ce : Ve × Ve → [0, 1]), reference(s), tolerance τ . |
| Selection / aggregation | Set-theoretic filtering (union, intersection) across parameters. | Contradiction-aware filtering F (τ ) and/or weighted aggregation (plithogenic-style). |
| Typical use | Parameter-driven modeling of uncertainty and preferences. | Conflict-aware modeling when parameters/values may be mutually opposing. |
| Reduction | — | Recovers Soft Set when c ≡ 0 (and no contradiction-based filtering is applied). |
2.9 HesiSoft Set
A HesiSoft Set is a soft set F : E → P(U ) together with a symmetric hesitancy map h :
E×E → Pfin ([0, 1]). Related concepts include hesitant fuzzy sets [6,54] and hesitant neutrosophic
sets [13, 55].
Definition 2.9.1 (HesiSoft Set). Let U be a finite universe of objects and E a finite set of
parameters. A HesiSoft Set is a quadruple
HSS := (U, E, F, h),
where
• F : E → P(U ) is the (crisp) soft mapping; F (e) ⊆ U is the set of objects accepted (or
classified as positive) under parameter e;
• h : E × E → Pfin ([0, 1]) is a hesitancy map on parameters, symmetric and normalized on
the diagonal:
h(e, e) = {0},
h(e, f ) = h(f, e) (∀ e, f ∈ E),
where Pfin ([0, 1]) denotes the family of all finite subsets of [0, 1].
For x ∈ U and e ∈ E , the atomic statement “x is accepted by e” is represented by
A(x, e) : x ∈ F (e),
with truth value T if x ∈ F (e) and F otherwise.
Example 2.9.2 (Hiring shortlisting with parameter-wise hesitancy). Let U be the set of applicants
U = {u1 , u2 , u3 , u4 , u5 },
and let E be the set of evaluation parameters
E = {Tech, Comm, Lead, Culture},
standing for technical skills, communication, leadership, and culture fit, respectively.
(1) Crisp acceptance map. Define F : E → P(U ) by
F (Tech) = {u1 , u2 , u4 },
F (Comm) = {u2 , u3 , u5 },
F (Lead) = {u1 , u3 },
F (Culture) = {u2 , u4 , u5 }.
Thus, for example, A(u4 , Tech) is true (since u4 ∈ F (Tech)), while A(u4 , Comm) is false (since
u4 ∉ F (Comm)).
(2) Hesitancy map on parameters. Define h : E × E → Pfin ([0, 1]) by setting h(e, e) = {0}
for all e ∈ E , and for distinct parameters specify (symmetrically):
h(Tech, Comm) = {0.2, 0.4},
h(Tech, Lead) = {0.1, 0.3},
h(Tech, Culture) = {0.4, 0.6}, h(Comm, Lead) = {0.3, 0.5},
h(Comm, Culture) = {0.1, 0.2}, h(Lead, Culture) = {0.5, 0.7},
and h(e, f ) = h(f, e) for all pairs. Here each finite set h(e, f ) records a committee hesitancy
profile about how strongly the two criteria e and f should co-influence a final hiring decision
(e.g., disagreements or multiple plausible weights coming from different interviewers).
Then
HSS = (U, E, F, h)
is a HesiSoft Set modeling a real hiring shortlist: F captures crisp accept/reject outcomes per
criterion, while h captures finite-valued hesitancy between criteria induced by mixed expert
opinions.
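A small Python sketch of this HesiSoft Set, storing h on unordered parameter pairs so that symmetry and the diagonal normalization h(e, e) = {0} hold by construction (the helper names are ours, purely illustrative):

```python
# HesiSoft sketch: crisp acceptance map F plus a finite-valued hesitancy
# map h on parameter pairs, stored on unordered pairs (frozensets).
F = {
    "Tech":    {"u1", "u2", "u4"},
    "Comm":    {"u2", "u3", "u5"},
    "Lead":    {"u1", "u3"},
    "Culture": {"u2", "u4", "u5"},
}

_h = {
    frozenset({"Tech", "Comm"}):    {0.2, 0.4},
    frozenset({"Tech", "Lead"}):    {0.1, 0.3},
    frozenset({"Tech", "Culture"}): {0.4, 0.6},
    frozenset({"Comm", "Lead"}):    {0.3, 0.5},
    frozenset({"Comm", "Culture"}): {0.1, 0.2},
    frozenset({"Lead", "Culture"}): {0.5, 0.7},
}

def h(e, f):
    """Hesitancy profile; the diagonal is normalized to {0}."""
    return {0.0} if e == f else _h[frozenset({e, f})]

# Symmetry holds automatically because pairs are stored unordered.
params = list(F)
symmetric = all(h(e, f) == h(f, e) for e in params for f in params)
```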
2.10 MultiPolar Soft Set
A multipolar soft set assigns to each parameter multiple “polar” subsets, thereby capturing
several perspectives or evaluations over the same universe. Related notions include multipolar
fuzzy sets [56] and multipolar neutrosophic sets [57–59].
Definition 2.10.1 (Multipolar Soft Set). Let U be a nonempty universe and let E be a nonempty
set of parameters. Fix an integer m ≥ 2 (the number of poles). An m-polar (multipolar) soft set
over U with respect to E is an ordered pair (F, E), where
F : E −→ P(U )^m
is a mapping. For each parameter e ∈ E , we write
F (e) = (F1 (e), F2 (e), . . . , Fm (e)),
where each component satisfies Fi (e) ⊆ U for i = 1, 2, . . . , m. The subset Fi (e) is called the
i-th polar approximation of U under the parameter e, and it represents the evaluation of e from
the i-th perspective (pole). Equivalently, an m-polar soft set can be viewed as an ordered family
of ordinary soft sets (F1 , E), . . . , (Fm , E) on the same universe and the same parameter set.
Remark 2.10.2. If m = 1, then F : E → P(U ) and (F, E) reduces to an ordinary (crisp) soft
set. For m = 2, the model yields a two-pole soft representation (often studied as a bipolar-type
framework, possibly with additional constraints depending on the chosen bipolar definition).
Example 2.10.3 (Real-life example of a multipolar soft set: multi-stakeholder project risk
screening). Let U be a set of candidate IT projects in a company:
U = {p1 , p2 , p3 , p4 , p5 }.
Let E be a set of risk-related parameters and take
E = {eSec , eCost , eSched },
where eSec = “security risk”, eCost = “budget risk”, eSched = “schedule risk”.
Suppose three different stakeholder groups evaluate each risk parameter:
(1) Security team, (2) Finance team, (3) PMO.
3
Define a mapping F : E → P(U ) by letting, for each parameter e ∈ E ,

F (e) = F1 (e), F2 (e), F3 (e) ,
m = 3,
where Fi (e) ⊆ U is the set of projects judged by stakeholder i to have high risk under e.
For instance, assume the following assessments:
F (eSec ) = ({p2 , p4 }, {p4 }, {p2 , p3 , p4 }),
F (eCost ) = ({p3 }, {p1 , p3 , p5 }, {p1 , p5 }),
F (eSched ) = ({p2 , p5 }, {p5 }, {p1 , p2 , p5 }).
Then (F, E) is a 3-polar (multipolar) soft set over U : for each risk parameter e, the three
components represent the “high-risk” project subsets identified from three distinct perspectives.
Interpretation: this structure supports decisions such as consensus high risk for e (the intersection
F1 (e) ∩ F2 (e) ∩ F3 (e)), or any-stakeholder high risk (the union F1 (e) ∪ F2 (e) ∪ F3 (e)), depending
on the organization’s risk policy.
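The two aggregation policies just mentioned can be computed directly from the 3-polar data. A minimal sketch (the function names `consensus` and `any_stakeholder` are illustrative):

```python
from functools import reduce

# 3-polar soft set: each parameter maps to a tuple of per-stakeholder subsets
# (Security team, Finance team, PMO).
F = {
    "eSec":   ({"p2", "p4"}, {"p4"}, {"p2", "p3", "p4"}),
    "eCost":  ({"p3"}, {"p1", "p3", "p5"}, {"p1", "p5"}),
    "eSched": ({"p2", "p5"}, {"p5"}, {"p1", "p2", "p5"}),
}

def consensus(e):
    """Projects judged high-risk by every stakeholder (intersection of poles)."""
    return reduce(set.__and__, F[e])

def any_stakeholder(e):
    """Projects judged high-risk by at least one stakeholder (union of poles)."""
    return reduce(set.__or__, F[e])
```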
Related notions include the following concepts.
• Bipolar Soft Sets [60–62]: model parameters by paired positive/negative approximations,
enabling simultaneous support and opposition assessments for each object.
• Bipolar HyperSoft Sets [63–65]: extend bipolar soft sets to multi-attribute tuple parameters,
assigning positive and negative approximations to each tuple.
• HyperPolar Soft Sets [66]: A hyperpolar soft set assigns each parameter n polarity-based
subsets of the universe, capturing qualitative evaluations from multiple agents simultaneously.
2.11 Dynamic Soft Set
A dynamic soft set is a time-indexed family of soft sets, modeling parameterized approximations
that evolve across time or contexts [67–69].
Definition 2.11.1 (Dynamic soft set). [67–69] Let U be a nonempty universe of discourse, let
E be a (nonempty) set of parameters, and let T be a nonempty index set (e.g., time points,
system states, or contexts).
A dynamic soft set over (U, E) indexed by T is a family
S = { (t, Ft , At ) | t ∈ T },
such that for each t ∈ T :
At ⊆ E
and Ft : At −→ P(U ).
Equivalently, one may represent S as the set of triples
S = { (t, e, Ft (e)) | t ∈ T, e ∈ At },
where Ft (e) ⊆ U is the (crisp) approximation of U with respect to parameter e at index t.
For each t ∈ T , the pair (Ft , At ) is the time-/context-slice soft set of S at t.
Remark 2.11.2 (Reduction to a classical soft set). If there exists a fixed A ⊆ E and a fixed
mapping F : A → P(U ) such that At = A and Ft = F for all t ∈ T , then the dynamic soft set
S reduces to the classical (static) soft set (F, A).
Example 2.11.3 (Real-life example of a dynamic soft set: daily product availability in a grocery
store). Let U be a set of products sold by a grocery store:
U = {p1 , p2 , p3 , p4 , p5 , p6 }.
Let E be a set of availability-related parameters:
E = {eIn , eSale , eOut },
where eIn = “in stock”, eSale = “on sale”, and eOut = “out of stock”. Let the time index set be
three consecutive days,
T = {t1 , t2 , t3 }.
For each day t ∈ T , define a time-slice soft set (Ft , At ) describing the store status. Take
At = {eIn , eSale , eOut } ⊆ E for all t ∈ T .
Day t1 :
Ft1 (eIn ) = {p1 , p2 , p3 , p5 },
Ft1 (eSale ) = {p2 , p5 },
Ft1 (eOut ) = {p4 , p6 }.


Day t2 :
Ft2 (eIn ) = {p1 , p3 , p4 , p5 },
Ft2 (eSale ) = {p1 , p4 },
Ft2 (eOut ) = {p2 , p6 }.
Day t3 :
Ft3 (eIn ) = {p1 , p2 , p4 , p6 },
Ft3 (eSale ) = {p2 , p6 },
Ft3 (eOut ) = {p3 , p5 }.
Define
S = {(t, Ft , At ) | t ∈ T }.
Then S is a dynamic soft set over (U, E) indexed by T : each time point t has its own soft
mapping Ft : At → P(U ) describing which products are in stock, on sale, or out of stock on
that day.
Interpretation: the same parameter (e.g., “in stock”) may select different subsets of products as
inventory changes over time, and the dynamic soft set records these time-dependent approximations.
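A Python sketch of this dynamic soft set, with an illustrative helper `always` that extracts the objects stable under a parameter across all time slices (the helper is our addition, not part of the definition):

```python
# Dynamic soft set: time index -> time-slice soft mapping F_t.
S = {
    "t1": {"eIn": {"p1", "p2", "p3", "p5"}, "eSale": {"p2", "p5"}, "eOut": {"p4", "p6"}},
    "t2": {"eIn": {"p1", "p3", "p4", "p5"}, "eSale": {"p1", "p4"}, "eOut": {"p2", "p6"}},
    "t3": {"eIn": {"p1", "p2", "p4", "p6"}, "eSale": {"p2", "p6"}, "eOut": {"p3", "p5"}},
}

def slice_at(t):
    """Return the time-slice soft set F_t at index t."""
    return S[t]

def always(e):
    """Objects in F_t(e) for every t (stable under parameter e across time)."""
    return set.intersection(*(S[t][e] for t in S))
```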
2.12 Type-n Soft Set
A Type-n soft set iteratively parameterizes soft sets, assigning each parameter a Type-(n−1) soft
set over the universe. Related concepts include Type-2 fuzzy sets [70, 71], Type-2 neutrosophic
sets [72–74], and Type-2 soft sets [75].
Definition 2.12.1 (Type-n soft set). Let U be a nonempty universe and let E be a nonempty
set of parameters. Define, recursively, the classes Σ(n) (U, E) of Type-n soft sets over (U, E) as
follows.
(1) Type-1 (classical) soft sets.
Σ(1) (U, E) := { (F, A) | A ⊆ E, F : A → P(U ) }.
Elements of Σ(1) (U, E) are (crisp) soft sets in the sense of Molodtsov.
(2) Inductive step. For each integer n ≥ 2, define

Σ(n) (U, E) := { (F, A) | A ⊆ E, F : A → Σ(n−1) (U, E) }.
An element (F, A) ∈ Σ(n) (U, E) is called a Type-n soft set (briefly, Tn SS ) over (U, E). The set
A is the primary parameter set. For each a ∈ A we may write
F (a) = (Fa , Aa ) ∈ Σ(n−1) (U, E),
so that Aa ⊆ E is a (possibly a-dependent) underlying parameter set at level 2. Iterating this
decomposition, for any chain a1 ∈ A, a2 ∈ Aa1 , . . . , an ∈ Aa1 ···an−1 ,
the terminal evaluation is a subset of the universe,
Fa1 ···an−1 (an ) ⊆ U.
Remark 2.12.2. (i) For n = 1, Definition 2.12.1 reduces to the usual (crisp) soft set F : A →
P(U ).
(ii) For n = 2, F : A → Σ(1) (U, E), i.e., each primary parameter a ∈ A is assigned a Type-1
soft set; this is the usual Type-2 soft set [75–78] viewpoint (parameterization over an already
parameterized family).
Example 2.12.3 (Real-life example of a Type-3 soft set: company selection by department →
criterion → strictness). Let U be a set of job candidates:
U = {u1 , u2 , u3 , u4 , u5 , u6 }.
Let E be a pool of evaluation parameters (used at all levels), including:
E = {eAlg , eComm , eExp , eStrong , eModerate }.
Interpret eAlg = “good algorithms”, eComm = “good communication”, eExp = “relevant experience”, and eStrong , eModerate as strictness levels (meta-criteria) for acceptance.
We construct a Type-3 soft set (F, A) ∈ Σ(3) (U, E) that models the following hierarchy:
Department −→ Criterion −→ Strictness.
Level 1 (primary parameters): departments. Let
A = {aEng , aPM } ⊆ E
be the primary parameter set, where aEng = “Engineering department” and aPM = “Product
management department”. For each department a ∈ A, we define a Type-2 soft set
F (a) = (Fa , Aa ) ∈ Σ(2) (U, E).
Level 2: criteria used by each department. Let
AaEng = {eAlg , eExp } ⊆ E,
AaPM = {eComm , eExp } ⊆ E.
Thus Engineering focuses on (Alg, Exp), while PM focuses on (Comm, Exp). For each criterion
c ∈ Aa , we define a Type-1 soft set
Fa (c) = (Fa,c , Aa,c ) ∈ Σ(1) (U, E),
where Aa,c is a strictness-parameter set and Fa,c : Aa,c → P(U ) returns the candidate subset
accepted under that strictness.
Level 3: strictness parameters and final accepted subsets. For each (a, c) above, take
Aa,c = {eStrong , eModerate } ⊆ E.
Define the terminal (Type-1) acceptance mappings as follows.
Engineering → Algorithms:
FaEng ,eAlg (eStrong ) = {u1 , u4 },
FaEng ,eAlg (eModerate ) = {u1 , u3 , u4 , u6 }.
Engineering → Experience:
FaEng ,eExp (eStrong ) = {u2 , u4 },
FaEng ,eExp (eModerate ) = {u1 , u2 , u4 , u5 }.
PM → Communication:
FaPM ,eComm (eStrong ) = {u3 , u5 },
FaPM ,eComm (eModerate ) = {u1 , u3 , u5 , u6 }.
PM → Experience:
FaPM ,eExp (eStrong ) = {u2 , u5 },
FaPM ,eExp (eModerate ) = {u2 , u4 , u5 , u6 }.
Verification of the Type-3 structure. By construction, for each a ∈ A, the pair (Fa , Aa ) is
a Type-2 soft set because Fa : Aa → Σ(1) (U, E). Moreover, for each c ∈ Aa , the pair (Fa,c , Aa,c )
is a Type-1 soft set because Fa,c : Aa,c → P(U ). Hence (F, A) ∈ Σ(3) (U, E) is a Type-3 soft set.
Interpretation. A chain (a, c, s) with a ∈ {Eng, PM}, c a criterion used by a, and s ∈
{Strong, Moderate} yields the final accepted subset Fa,c (s) ⊆ U . For example, the chain
aEng → eAlg → eStrong
selects the candidates {u1 , u4 }, whereas
aPM → eComm → eModerate
selects {u1 , u3 , u5 , u6 }.
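The Type-3 structure is naturally a nested mapping, and evaluating a chain (a, c, s) is just a three-step lookup. A minimal sketch (the `evaluate` helper is illustrative):

```python
# Type-3 soft set as nested mappings:
# department -> criterion -> strictness -> accepted subset of U.
F = {
    "aEng": {
        "eAlg": {"eStrong": {"u1", "u4"}, "eModerate": {"u1", "u3", "u4", "u6"}},
        "eExp": {"eStrong": {"u2", "u4"}, "eModerate": {"u1", "u2", "u4", "u5"}},
    },
    "aPM": {
        "eComm": {"eStrong": {"u3", "u5"}, "eModerate": {"u1", "u3", "u5", "u6"}},
        "eExp":  {"eStrong": {"u2", "u5"}, "eModerate": {"u2", "u4", "u5", "u6"}},
    },
}

def evaluate(chain):
    """Follow a parameter chain (a1, ..., an) down to the terminal subset of U."""
    node = F
    for a in chain:
        node = node[a]
    return node
```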
2.13 L-Soft Set
An L-soft set maps each parameter to an L-valued set on the universe, enabling lattice-graded,
parameterized membership evaluations [79, 80].
Definition 2.13.1 (L-set). [79] Let X be a nonempty universe and let (L, ≤) be a bounded
lattice (or, more generally, a poset of truth degrees with designated 0L , 1L ). An L-set (also
called an L-fuzzy set) on X is a mapping
A : X −→ L.
For x ∈ X , the value A(x) ∈ L is interpreted as the L-valued grade of membership (truth
degree) of the statement “x ∈ A”. We denote by LX the class of all L-sets on X .
Definition 2.13.2 (L-soft set). [79] Let X be a nonempty universe, let E be a nonempty set
of parameters, and fix a bounded lattice (L, ≤, 0L , 1L ). Let A ⊆ E be nonempty. An L-soft set
over X (with parameter set A) is a pair
Θ = (F, A),
where
F : A −→ LX .
Equivalently, Θ can be identified with a single mapping
µΘ : A × X −→ L,
µΘ (a, x) := F (a)(x),
so that for each fixed a ∈ A the section x ↦ µΘ (a, x) is an L-set on X . For a ∈ A, the L-set
F (a) ∈ LX is called the a-approximation (or a-evaluation) of Θ.
Example 2.13.3 (An L-soft set for qualitative product-rating in e-commerce). Let X be a finite
set of smartphones
X = {p1 , p2 , p3 , p4 },
and let the parameter set be
A = {Battery, Camera, Price}.
We use a bounded lattice of linguistic grades
L = {L, M, H},
L ≤ M ≤ H,
0L = L, 1L = H,
interpreted as Low, Medium, and High satisfaction, respectively.
Define an L-soft set Θ = (F, A) by specifying, for each parameter a ∈ A, an L-set F (a) ∈ LX
(i.e., a map X → L) that records the qualitative evaluation of each phone:
F (Battery) : X → L, F (Battery)(p1 ) = H, F (Battery)(p2 ) = M, F (Battery)(p3 ) = H, F (Battery)(p4 ) = L,
F (Camera) : X → L, F (Camera)(p1 ) = M, F (Camera)(p2 ) = H, F (Camera)(p3 ) = M, F (Camera)(p4 ) = H,
F (Price) : X → L,
F (Price)(p1 ) = M, F (Price)(p2 ) = L, F (Price)(p3 ) = H, F (Price)(p4 ) = M.
Equivalently, the associated membership map µΘ : A × X → L given by µΘ (a, p) = F (a)(p)
encodes, for each criterion a, the lattice-graded satisfaction level of each product p ∈ X .
Interpretation. For example, µΘ (Battery, p3 ) = H means that p3 is evaluated as high on battery
life, while µΘ (Price, p2 ) = L means that p2 is judged low (unfavorable) in price attractiveness.
Thus Θ models parameterized, qualitative (lattice-valued) assessments without using real-valued
scores.
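A Python sketch of this L-soft set: the chain L ≤ M ≤ H is encoded by integer ranks, and an illustrative level-cut helper `at_least` extracts the products graded at least at a given level (the helper is our addition, not from [79]):

```python
# Bounded lattice of linguistic grades: L < M < H, encoded by rank.
RANK = {"L": 0, "M": 1, "H": 2}

# Membership map mu_Theta : A x X -> L.
mu = {
    ("Battery", "p1"): "H", ("Battery", "p2"): "M", ("Battery", "p3"): "H", ("Battery", "p4"): "L",
    ("Camera",  "p1"): "M", ("Camera",  "p2"): "H", ("Camera",  "p3"): "M", ("Camera",  "p4"): "H",
    ("Price",   "p1"): "M", ("Price",   "p2"): "L", ("Price",   "p3"): "H", ("Price",   "p4"): "M",
}

def at_least(a, grade):
    """Level cut: products whose grade under parameter a is >= the given grade."""
    return {x for (b, x) in mu if b == a and RANK[mu[(b, x)]] >= RANK[grade]}
```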
2.14 PosetSoft set (monotone soft set)
A PosetSoft set is a soft set whose parameter order enforces monotonic inclusion: stronger
parameters yield larger object subsets.
Definition 2.14.1 (PosetSoft set (monotone soft set)). Let U be a nonempty universe and let
(A, ⪯) be a nonempty partially ordered set of parameters. A PosetSoft set over U is a soft set
(F, A) with
F : A → P(U )
satisfying the monotonicity constraint
a ⪯ b =⇒ F (a) ⊆ F (b)    (a, b ∈ A).
Example 2.14.2 (Example of a PosetSoft set: apartment shortlisting under increasing budget).
Let the universe be a finite set of apartments
U = {u1 , u2 , u3 , u4 , u5 , u6 }.
Assume their monthly rents (in thousand JPY) are:
rent(u1 ) = 75, rent(u2 ) = 80, rent(u3 ) = 92, rent(u4 ) = 100, rent(u5 ) = 108, rent(u6 ) = 125.
Let the parameter set be the set of budget thresholds
A = {80, 100, 120},
equipped with the usual order ⪯ := ≤ (so 80 ⪯ 100 ⪯ 120). Define a soft set (F, A) over U by
F (b) = { u ∈ U | rent(u) ≤ b }    (b ∈ A).
Concretely,
F (80) = {u1 , u2 },    F (100) = {u1 , u2 , u3 , u4 },    F (120) = {u1 , u2 , u3 , u4 , u5 }.
Then for any a, b ∈ A,
a ⪯ b =⇒ F (a) ⊆ F (b),
because increasing the allowable budget can only add (and never remove) feasible apartments.
Hence (F, A) is a PosetSoft set over U .
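The monotonicity constraint is directly checkable; a small Python sketch of the budget example (the `is_monotone` helper is illustrative):

```python
# PosetSoft check: budget thresholds ordered by <=, F(b) = apartments with rent <= b.
RENT = {"u1": 75, "u2": 80, "u3": 92, "u4": 100, "u5": 108, "u6": 125}
A = [80, 100, 120]  # parameter poset, ordered by the usual <=

def F(b):
    """Apartments affordable at budget b (in thousand JPY)."""
    return {u for u, r in RENT.items() if r <= b}

def is_monotone():
    """Verify a <= b implies F(a) is a subset of F(b) over all comparable pairs."""
    return all(F(a) <= F(b) for a in A for b in A if a <= b)
```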
Theorem 2.14.3 (Soft-set structure and well-definedness of PosetSoft sets). Let U be a nonempty
universe and let (A, ⪯) be a nonempty poset of parameters. Let F : A → P(U ) be a mapping
satisfying
a ⪯ b =⇒ F (a) ⊆ F (b)    (a, b ∈ A).
Then:
(i) (Soft-set structure). (F, A) is a (classical) soft set over U .
(ii) (Well-defined monotonicity predicate). The monotonicity constraint is a well-defined
property of (F, A), i.e., for each comparable pair (a, b) ∈ A × A with a ⪯ b, the inclusion
statement F (a) ⊆ F (b) is unambiguous.
(iii) (Induced order-preserving operator). Define ι : P(U )^A → {0, 1} by ι(F ) = 1 iff F
satisfies a ⪯ b ⇒ F (a) ⊆ F (b). Then ι is well-defined, and the class
PosetSoft(U ; A, ⪯) := {(F, A) | F : A → P(U ), ι(F ) = 1}
of all PosetSoft sets over U (with parameter poset (A, ⪯)) is well-defined.
(iv) (Canonical associated relation on U ). Define a binary relation ⪯F on U by
x ⪯F y :⇐⇒ (∀a ∈ A) (x ∈ F (a) ⇒ y ∈ F (a)).
Then ⪯F is well-defined and is a preorder on U (reflexive and transitive).
Proof. (i) By assumption A ≠ ∅ and F : A → P(U ), hence (F, A) is a soft set over U by the
standard definition.
(ii) For any a, b ∈ A with a ⪯ b, the sets F (a) and F (b) are uniquely determined subsets of
U because F is a function. Therefore the statement F (a) ⊆ F (b) is unambiguous, and the
implication a ⪯ b ⇒ F (a) ⊆ F (b) is a well-defined predicate on (F, A).
(iii) The map ι assigns to each function F ∈ P(U )^A a unique truth value in {0, 1}, because
the defining condition is a (well-defined) universal statement over the set of comparable pairs
in A. Hence ι is well-defined, and consequently the subset of soft sets satisfying ι(F ) = 1 is
well-defined; this is exactly PosetSoft(U ; A, ⪯).
(iv) We check reflexivity and transitivity.
Reflexive. Fix x ∈ U . For every a ∈ A, the implication x ∈ F (a) ⇒ x ∈ F (a) holds, hence
x ⪯F x.
Transitive. Assume x ⪯F y and y ⪯F z . Let a ∈ A and suppose x ∈ F (a). From x ⪯F y we
obtain y ∈ F (a), and then from y ⪯F z we obtain z ∈ F (a). Thus (∀a ∈ A)(x ∈ F (a) ⇒ z ∈
F (a)), i.e., x ⪯F z .
Therefore ⪯F is a well-defined preorder on U .
2.15 Random soft set
A random soft set is a measurable mapping from outcomes to soft sets, yielding parameter-indexed random subsets under uncertainty.
Definition 2.15.1 (Measurable space of soft sets). Fix U and A as above and write
SS(U, A) := P(U )^A = { F : A → P(U ) },
the set of all soft mappings on (U, A) (equivalently, all soft sets (F, A) over U with this fixed
parameter set).
For each a ∈ A and u ∈ U , define the membership cylinder
Ca,u := { F ∈ SS(U, A) | u ∈ F (a) } ⊆ SS(U, A).
Let
ΣU,A := σ {Ca,u : a ∈ A, u ∈ U }
be the σ -algebra generated by all such cylinders.
Definition 2.15.2 (Random soft set). Let (Ω, F, P) be a probability space and fix a universe
U and parameter set A. A random soft set (on (U, A)) is an (F, ΣU,A )-measurable mapping
F : (Ω, F) −→ (SS(U, A), ΣU,A ).
For ω ∈ Ω, write F(ω) = Fω ∈ SS(U, A); then each outcome ω induces a (deterministic) soft set
(Fω , A) over U .
Example 2.15.3 (Random soft set: commuting routes under random traffic). Let U = {r1 , r2 , r3 , r4 }
be a finite set of candidate commuting routes (e.g., train lines or driving routes), and let
A = {aFast , aCheap , aSafe }
be a parameter set, where aFast = “fast”, aCheap = “cheap”, and aSafe = “low incident risk”.
Model the (uncertain) morning condition by a finite probability space
Ω = {ωL , ωN , ωH },
F = 2Ω ,
P(ωL ) = 0.3, P(ωN ) = 0.5, P(ωH ) = 0.2,
where ωL , ωN , ωH represent light/normal/heavy congestion (or disruption) states.
For each outcome ω ∈ Ω, define a soft set (Fω , A) over U by specifying Fω (a) ⊆ U as the set of
routes satisfying criterion a under state ω . For instance, let:
| ω | Fω (aFast ) | Fω (aCheap ) | Fω (aSafe ) |
| --- | --- | --- | --- |
| ωL | {r1 , r2 } | {r3 , r4 } | {r2 , r4 } |
| ωN | {r2 } | {r3 , r4 } | {r2 , r3 } |
| ωH | {r4 } | {r4 } | {r3 , r4 } |
(Interpretation: under heavy congestion, only route r4 remains “fast” and also “cheap”, while
the “safe” set depends on disruption patterns.)
Define the mapping
F : Ω −→ SS(U, A),
F(ω) := (Fω , A).
Since U and A are finite, the soft-set space SS(U, A) is finite; taking ΣU,A = 2^SS(U,A) , the
map F is automatically (F, ΣU,A )-measurable. Hence F is a random soft set in the sense of
Definition 2.15.2: each realized traffic state ω induces a deterministic soft set (Fω , A) describing,
parameterwise, which routes are acceptable that day.
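Since Ω is finite, the probability that a route belongs to Fω (a), i.e. the mass of the membership cylinder Ca,u , is a finite sum. A Python sketch (`membership_prob` is an illustrative helper name):

```python
# Random soft set sketch: each traffic state omega induces a deterministic soft set.
P = {"wL": 0.3, "wN": 0.5, "wH": 0.2}  # probabilities of light/normal/heavy congestion
F = {
    "wL": {"aFast": {"r1", "r2"}, "aCheap": {"r3", "r4"}, "aSafe": {"r2", "r4"}},
    "wN": {"aFast": {"r2"},       "aCheap": {"r3", "r4"}, "aSafe": {"r2", "r3"}},
    "wH": {"aFast": {"r4"},       "aCheap": {"r4"},       "aSafe": {"r3", "r4"}},
}

def membership_prob(route, a):
    """P(route in F_omega(a)): probability mass of the membership cylinder C_{a,route}."""
    return sum(p for w, p in P.items() if route in F[w][a])
```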
Proposition 2.15.4 (Pointwise measurability criterion). A mapping F : Ω → SS(U, A) is a
random soft set if and only if for every a ∈ A and u ∈ U the event
{ω ∈ Ω : u ∈ Fω (a)} ∈ F.
Proof. By Definition 2.15.2, F is measurable iff F−1 (B) ∈ F for all B ∈ ΣU,A . Since ΣU,A is
generated by the cylinders Ca,u (Definition 2.15.1), this is equivalent to requiring F−1 (Ca,u ) ∈ F
for all (a, u). But
F−1 (Ca,u ) = {ω ∈ Ω : F(ω) ∈ Ca,u } = {ω ∈ Ω : u ∈ Fω (a)},
which yields the claim.
Remark 2.15.5 (Relation to random set theory). For each fixed parameter a ∈ A, the coordinate map
Fa : Ω → P(U ),
Fa (ω) := Fω (a),
is a random subset of U in the sense that all membership events {ω : u ∈ Fa (ω)} are measurable.
Thus a random soft set is precisely a family of random subsets indexed by parameters, packaged as
a single measurable map into the soft-set space; this parallels the standard viewpoint of random
sets as measurable set-valued mappings.
2.16 Capacitary soft set
A capacitary soft set assigns each parameter a normalized monotone capacity on U , representing
nonadditive, uncertainty-aware set evaluations.
Definition 2.16.1 (Capacitary soft set (nonadditive set-function valued)). Let U be a nonempty
finite universe and let
Cap(U ) := { ν : P(U ) → [0, 1] | ν(∅) = 0, ν(U ) = 1, A ⊆ B ⇒ ν(A) ≤ ν(B) }
be the family of (normalized) capacities on U . Let A be a nonempty parameter set. A capacitary
soft set over U is a pair (F, A) with
F : A → Cap(U ).
Example 2.16.2 (Real-life example of a capacitary soft set: evaluating cybersecurity control
bundles under different threat contexts). Let U be a finite set of candidate cybersecurity controls:
U = {c1 , c2 , c3 },
where c1 = multi-factor authentication (MFA), c2 = endpoint protection (EDR), and c3 =
network monitoring (NDR). Let A = {aLow , aHigh } be a parameter set of threat contexts (low-threat vs. high-threat season).
A capacitary soft set (F, A) assigns to each a ∈ A a normalized capacity νa : P(U ) → [0, 1],
which quantifies the (possibly nonadditive) overall risk-reduction effectiveness of any bundle
S ⊆ U under context a.
Define νaLow by
νaLow (∅) = 0,    νaLow (U ) = 1,
and
νaLow ({c1 }) = 0.40,    νaLow ({c2 }) = 0.30,    νaLow ({c3 }) = 0.25,
νaLow ({c1 , c2 }) = 0.70,    νaLow ({c1 , c3 }) = 0.65,    νaLow ({c2 , c3 }) = 0.55.
Define νaHigh by
νaHigh (∅) = 0,    νaHigh (U ) = 1,
and
νaHigh ({c1 }) = 0.45,    νaHigh ({c2 }) = 0.35,    νaHigh ({c3 }) = 0.35,
νaHigh ({c1 , c2 }) = 0.85,    νaHigh ({c1 , c3 }) = 0.80,    νaHigh ({c2 , c3 }) = 0.75.
Each νa is a normalized capacity (monotone w.r.t. ⊆, with νa (∅) = 0 and νa (U ) = 1), and it is
typically nonadditive; for instance, under high threat,
νaHigh ({c1 , c2 }) = 0.85 need not equal νaHigh ({c1 }) + νaHigh ({c2 }) = 0.80,
reflecting synergy/overlap effects among controls.
Now define F : A → Cap(U ) by
F (aLow ) = νaLow ,    F (aHigh ) = νaHigh .
Then (F, A) is a capacitary soft set over U in the sense of Definition 2.16.1.
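The capacity axioms (normalization and monotonicity) and the nonadditivity claim can be verified by brute force over the 2^3 subsets of U. A Python sketch (`make_capacity` and `is_capacity` are illustrative helper names):

```python
from itertools import combinations

U = ["c1", "c2", "c3"]  # MFA, EDR, NDR

def make_capacity(values):
    """Build a capacity from a table mapping frozensets of U to [0, 1]."""
    def nu(S):
        return values[frozenset(S)]
    return nu

high = make_capacity({
    frozenset(): 0.0,
    frozenset({"c1"}): 0.45, frozenset({"c2"}): 0.35, frozenset({"c3"}): 0.35,
    frozenset({"c1", "c2"}): 0.85, frozenset({"c1", "c3"}): 0.80,
    frozenset({"c2", "c3"}): 0.75,
    frozenset(U): 1.0,
})

def is_capacity(nu):
    """Check normalization and monotonicity of nu over all subsets of U."""
    subsets = [frozenset(c) for k in range(len(U) + 1) for c in combinations(U, k)]
    if nu(frozenset()) != 0.0 or nu(frozenset(U)) != 1.0:
        return False
    return all(nu(X) <= nu(Y) for X in subsets for Y in subsets if X <= Y)
```

Note that high({c1, c2}) = 0.85 exceeds high({c1}) + high({c2}) = 0.80, exhibiting the nonadditivity (synergy) discussed above.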
Theorem 2.16.3 (Soft-set structure and well-definedness of capacitary soft sets). Let U be a
nonempty finite universe, let A be a nonempty parameter set, and let
Cap(U ) := { ν : P(U ) → [0, 1] | ν(∅) = 0, ν(U ) = 1, X ⊆ Y ⇒ ν(X) ≤ ν(Y ) }.
If (F, A) satisfies F : A → Cap(U ), then:
(i) Soft-set structure. (F, A) is a T -valued soft set over U with codomain T = Cap(U );
equivalently, it is a mapping F : A → T with T ⊆ [0, 1]^P(U ) , whose values are monotone set
functions normalized by ν(∅) = 0 and ν(U ) = 1.
(ii) Well-defined parameter evaluations. For each a ∈ A, the object F (a) is a uniquely
determined capacity on U , i.e., a uniquely determined function
F (a) : P(U ) → [0, 1] with F (a)(∅) = 0, F (a)(U ) = 1, X ⊆ Y ⇒ F (a)(X) ≤ F (a)(Y ).
(iii) Well-defined induced evaluation map. The map
Φ(F,A) : A × P(U ) −→ [0, 1],
Φ(F,A) (a, X) := F (a)(X),
is well-defined (single-valued).
Proof. (i) By assumption, A is nonempty and F is a mapping A → Cap(U ). A (classical) soft
set over a universe Z with parameter set A is any map A → P(Z). Here the “universe” being
evaluated is P(U ) and the codomain is not P(·) but the prescribed codomain T = Cap(U );
thus (F, A) is precisely a T -valued soft set in the sense of Definition 2.19.1. Moreover, each
F (a) ∈ Cap(U ) is, by definition of Cap(U ), a normalized monotone set function on P(U ).
(ii) Fix a ∈ A. Since F is a function, the value F (a) is uniquely determined. Because the
codomain of F is Cap(U ), we have F (a) ∈ Cap(U ), hence F (a) : P(U ) → [0, 1] and the three
axioms F (a)(∅) = 0, F (a)(U ) = 1, and X ⊆ Y ⇒ F (a)(X) ≤ F (a)(Y ) hold for all X, Y ⊆ U .
Therefore each parameter a determines a uniquely defined capacity.
(iii) Define Φ(F,A) (a, X) := F (a)(X) for (a, X) ∈ A × P(U ). This is meaningful because,
by (ii), for each fixed a ∈ A the expression F (a)(X) is defined for every X ∈ P(U ) and
belongs to [0, 1]. Uniqueness follows from the single-valuedness of F (a) as a function. Hence
Φ(F,A) : A × P(U ) → [0, 1] is well-defined.
2.17 CoverSoft set
A CoverSoft set assigns each parameter a cover of U by nonempty subsets, encoding parameterdependent decompositions or granularizations.
Definition 2.17.1 (CoverSoft set). Let U be a nonempty universe. Write
Cov(U ) := { C ⊆ P(U ) \ {∅} | ⋃C∈C C = U }
for the family of all covers of U by nonempty subsets. Let A be a nonempty parameter set. A
CoverSoft set over U is a pair (F, A) with
F : A → Cov(U ).
Example 2.17.2 (Real-life example of a CoverSoft set: decomposing a delivery region into
service zones under different strategies). Let U be a finite universe of delivery addresses in a
small region:
U = {u1 , u2 , u3 , u4 , u5 , u6 }.
Let A = {aGeo , aTime } be a set of operational parameters, where aGeo denotes a geographic
zoning strategy and aTime denotes a time-window zoning strategy.
A CoverSoft set (F, A) assigns to each a ∈ A a cover F (a) ∈ Cov(U ), i.e., a family of nonempty
subsets of U whose union is U .
(1) Geographic zones. Define

F(aGeo) = {C1, C2, C3},

where

C1 = {u1, u2},   C2 = {u3, u4},   C3 = {u5, u6}.

Then each Ci ≠ ∅ and

C1 ∪ C2 ∪ C3 = U,

so F(aGeo) ∈ Cov(U).
(2) Time-window zones (overlapping cover). Define

F(aTime) = {D1, D2, D3},

where

D1 = {u1, u3, u5} (morning-feasible),
D2 = {u2, u3, u4} (afternoon-feasible),
D3 = {u4, u6} (evening-feasible).

Again, each Di ≠ ∅ and

D1 ∪ D2 ∪ D3 = {u1, u2, u3, u4, u5, u6} = U,

hence F(aTime) ∈ Cov(U).
(3) The CoverSoft set. Therefore the mapping F : A → Cov(U) given by

F(aGeo) = {C1, C2, C3},   F(aTime) = {D1, D2, D3},

defines a CoverSoft set (F, A) over U in the sense of Definition 2.17.1.
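The cover conditions in Definition 2.17.1, and the neighborhood map introduced in Theorem 2.17.3 below, can be checked mechanically. A minimal Python sketch (ours, not part of the survey), using the delivery-zone data above:

```python
# Check the CoverSoft-set conditions for the delivery-zone example.
U = {"u1", "u2", "u3", "u4", "u5", "u6"}

# F maps each parameter to a family of nonempty subsets of U.
F = {
    "aGeo":  [{"u1", "u2"}, {"u3", "u4"}, {"u5", "u6"}],
    "aTime": [{"u1", "u3", "u5"}, {"u2", "u3", "u4"}, {"u4", "u6"}],
}

def is_cover(family, universe):
    """True iff every block is a nonempty subset of `universe` and the blocks cover it."""
    return (all(b and b <= universe for b in family)
            and set().union(*family) == universe)

def neighborhoods(family, u):
    """All blocks of the cover containing u (the map N_(F,A)(a, u))."""
    return [b for b in family if u in b]

assert all(is_cover(F[a], U) for a in F)
```

For instance, `neighborhoods(F["aTime"], "u3")` returns both overlapping time windows containing u3, illustrating that covers, unlike partitions, may overlap.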


# Page. 34

![Page Image](https://bcdn.docswell.com/page/8JDK3XL6EG.jpg)

Theorem 2.17.3 (Soft-set structure and well-definedness of CoverSoft sets). Let U be a nonempty
universe and let A be a nonempty parameter set. Define

Cov(U) := { C ⊆ P(U) \ {∅} | ⋃_{C ∈ C} C = U }.

Assume (F, A) satisfies F : A → Cov(U). Then:
(i) (Soft-set structure). (F, A) is a T-valued soft set over U in the sense of Definition 2.19.1 with codomain T = Cov(U).
(ii) (Well-defined parameterwise covers). For each a ∈ A, the value F(a) is a uniquely determined cover of U by nonempty subsets; in particular,

F(a) ⊆ P(U) \ {∅}   and   ⋃_{C ∈ F(a)} C = U.

(iii) (Well-defined induced granulation neighborhoods). Define, for (a, u) ∈ A × U,

N(F,A)(a, u) := { C ∈ F(a) | u ∈ C } ⊆ F(a).

Then N(F,A) : A × U → P(P(U)) is well-defined and satisfies N(F,A)(a, u) ≠ ∅ for all (a, u) ∈ A × U.
Proof. First note that Cov(U) is nonempty because {U} ∈ Cov(U) (as U ≠ ∅).
(i) Since A ≠ ∅ and F : A → Cov(U), the pair (F, A) is exactly a soft set whose values lie in the fixed codomain T = Cov(U); this is precisely the notion of a T-valued soft set.
(ii) Fix a ∈ A. Because F is a function, F(a) is uniquely determined. Moreover, F(a) ∈ Cov(U) implies by definition that F(a) ⊆ P(U) \ {∅} and ⋃_{C ∈ F(a)} C = U. Hence F(a) is a well-defined cover of U by nonempty subsets.
(iii) Fix (a, u) ∈ A × U and define N(F,A)(a, u) = {C ∈ F(a) : u ∈ C}. This set is well-defined because membership u ∈ C is unambiguous for each C ⊆ U. To see nonemptiness, use ⋃_{C ∈ F(a)} C = U from (ii): since u ∈ U, there exists C ∈ F(a) with u ∈ C, hence N(F,A)(a, u) ≠ ∅. Therefore N(F,A) is a well-defined (everywhere nonempty) neighborhood/granulation map.
2.18 FiltrationSoft set
A FiltrationSoft set assigns each parameter a nested chain of subsets, representing multi-level
selection stages from strict to relaxed.


# Page. 35

![Page Image](https://bcdn.docswell.com/page/VEPK4P4Z78.jpg)

Definition 2.18.1 (FiltrationSoft set (multi-level subset output)). Let U be a nonempty universe and fix an integer k ≥ 1. Define

Filk(U) := { (S0, . . . , Sk) ∈ P(U)^{k+1} | S0 ⊆ S1 ⊆ · · · ⊆ Sk }.

Let A be a nonempty parameter set. A FiltrationSoft set over U (of depth k) is a mapping

F : A → Filk(U).
Example 2.18.2 (Real-life example of a FiltrationSoft set: multi-stage hiring shortlists under
different job profiles). Let U be a finite set of job applicants:
U = {u1 , u2 , u3 , u4 , u5 , u6 , u7 }.
Fix depth k = 3, so that each output is a nested chain
(S0 , S1 , S2 , S3 ) ∈ Fil3 (U ) with S0 ⊆ S1 ⊆ S2 ⊆ S3 .
Let the parameter set be
A = {aSE , aDS },
where aSE represents a software engineer profile and aDS represents a data scientist profile.
Define F : A → Fil3 (U ) by specifying, for each profile a ∈ A, four successive shortlists:
• S0 (a): candidates passing a strict initial screen (must-haves),
• S1 (a): candidates passing a technical screen,
• S2 (a): candidates passing interviews,
• S3 (a): candidates considered at all for that profile.
(1) Software engineer profile. Let

F(aSE) = (S0^SE, S1^SE, S2^SE, S3^SE) := ({u1, u2}, {u1, u2, u4}, {u1, u2, u4, u6}, {u1, u2, u3, u4, u6}).

Then

S0^SE ⊆ S1^SE ⊆ S2^SE ⊆ S3^SE ⊆ U,

so F(aSE) ∈ Fil3(U).
(2) Data scientist profile. Let

F(aDS) = (S0^DS, S1^DS, S2^DS, S3^DS) := ({u2}, {u2, u5}, {u2, u5, u7}, {u2, u4, u5, u7}).

Again,

S0^DS ⊆ S1^DS ⊆ S2^DS ⊆ S3^DS ⊆ U,

so F(aDS) ∈ Fil3(U).
(3) FiltrationSoft set interpretation. Hence F : A → Fil3 (U ) is a FiltrationSoft set over U
in the sense of Definition 2.18.1. Each parameter a ∈ A selects a multi-level (nested) sequence
of acceptable applicants, representing progressively relaxed stages of the hiring pipeline for that
job profile.
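The nested-chain condition of Definition 2.18.1 and the stagewise soft sets of Theorem 2.18.3 (iv) can be validated directly; a small Python sketch (ours, illustrative only) with the hiring data and depth k = 3:

```python
# FiltrationSoft set: each parameter maps to a nested chain S0 ⊆ S1 ⊆ S2 ⊆ S3.
F = {
    "aSE": [{"u1", "u2"}, {"u1", "u2", "u4"},
            {"u1", "u2", "u4", "u6"}, {"u1", "u2", "u3", "u4", "u6"}],
    "aDS": [{"u2"}, {"u2", "u5"}, {"u2", "u5", "u7"}, {"u2", "u4", "u5", "u7"}],
}

def is_filtration(chain):
    """True iff S0 ⊆ S1 ⊆ ... ⊆ Sk."""
    return all(a <= b for a, b in zip(chain, chain[1:]))

def stage(F, i):
    """The i-th stage soft set F_i: parameter a -> S_i^(a)."""
    return {a: chain[i] for a, chain in F.items()}

assert all(is_filtration(F[a]) for a in F)
```

Each `stage(F, i)` is a classical soft set, and `stage(F, i)[a] <= stage(F, i + 1)[a]` holds for every parameter, matching the pointwise monotonicity in the theorem.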


# Page. 36

![Page Image](https://bcdn.docswell.com/page/27VVX2XM7Q.jpg)

Theorem 2.18.3 (Soft-set structure and well-definedness of FiltrationSoft sets). Let U be a
nonempty universe, fix an integer k ≥ 1, and define

Filk(U) := { (S0, . . . , Sk) ∈ P(U)^{k+1} | S0 ⊆ S1 ⊆ · · · ⊆ Sk }.

Let A be a nonempty parameter set, and let F : A → Filk(U) be a mapping. Then:
(i) (Nonemptiness of the codomain). Filk(U) ≠ ∅.
(ii) (Soft-set structure). (F, A) is a T-valued soft set over U in the sense of Definition 2.19.1 with codomain T = Filk(U).
(iii) (Well-defined filtration at each parameter). For every a ∈ A there exists a unique tuple

F(a) = (S0^(a), . . . , Sk^(a)) ∈ P(U)^{k+1}

satisfying the nesting chain

S0^(a) ⊆ S1^(a) ⊆ · · · ⊆ Sk^(a).

(iv) (Well-defined stagewise soft sets). For each i ∈ {0, 1, . . . , k} define the i-th stage map

Fi : A → P(U),   Fi(a) := πi(F(a)),

where πi : Filk(U) → P(U) is the projection πi(S0, . . . , Sk) = Si. Then each (Fi, A) is a classical soft set over U, and moreover the family is pointwise monotone:

Fi(a) ⊆ Fi+1(a)   (∀ a ∈ A, ∀ 0 ≤ i < k).
Proof. (i) Since U ≠ ∅, the tuple (∅, . . . , ∅) ∈ P(U)^{k+1} satisfies ∅ ⊆ · · · ⊆ ∅, hence (∅, . . . , ∅) ∈ Filk(U). Therefore Filk(U) ≠ ∅.
(ii) Because A ≠ ∅ and F : A → Filk(U), the pair (F, A) is, by definition, a soft set whose values lie in the fixed codomain T = Filk(U); this is exactly a T-valued soft set as in Definition 2.19.1.
(iii) Fix a ∈ A. Since F is a function, F(a) is uniquely determined. Also, F(a) ∈ Filk(U) means precisely that F(a) is a (k + 1)-tuple of subsets of U, say F(a) = (S0^(a), . . . , Sk^(a)), satisfying S0^(a) ⊆ · · · ⊆ Sk^(a). Hence the filtration chain at parameter a is well-defined.
(iv) For each fixed i, the projection πi is a well-defined function from Filk(U) to P(U), so Fi = πi ∘ F : A → P(U) is well-defined, and therefore (Fi, A) is a classical soft set over U. Finally, for any a ∈ A, the tuple F(a) = (S0^(a), . . . , Sk^(a)) lies in Filk(U), so Si^(a) ⊆ Si+1^(a) for 0 ≤ i < k. But Fi(a) = Si^(a) and Fi+1(a) = Si+1^(a) by definition of Fi, hence Fi(a) ⊆ Fi+1(a) for all a and i.


# Page. 37

![Page Image](https://bcdn.docswell.com/page/5JGLVRVX7L.jpg)

2.19 T-valued soft set
A T -valued soft set assigns each parameter a T -valued function on the universe, encoding parameterized evaluations without requiring crisp subsets.
Definition 2.19.1 (T -valued soft set (general template)). Let X be a nonempty universe, let E
be a nonempty set of parameters, and let A ⊆ E be nonempty. Let T be a nonempty codomain
set and write T X := { f : X → T }. A T -valued soft set over X (with parameter set A) is a
pair
Θ = (F, A),
where
F : A −→ T X .
Equivalently, Θ can be identified with a single mapping
µΘ : A × X −→ T ,
µΘ (a, x) := F (a)(x),
so that for each fixed a ∈ A the section x 7→ µΘ (a, x) is a T -valued function on X .
Theorem 2.19.2 (Well-definedness and canonical identification of T -valued soft sets). In the
setting of Definition 2.19.1, the two descriptions
F : A → T^X   and   µΘ : A × X → T,  µΘ(a, x) = F(a)(x),

are equivalent in a canonical (bijection) sense. More precisely, the map

Φ : (T^X)^A := {F : A → T^X} → T^{A×X} := {µ : A × X → T},   Φ(F)(a, x) := F(a)(x),

is a well-defined bijection with inverse

Ψ : T^{A×X} → (T^X)^A,   Ψ(µ)(a)(x) := µ(a, x).
Consequently, a T -valued soft set Θ = (F, A) determines a unique map µΘ : A × X → T , and
conversely every such µ determines a unique T -valued soft set (Ψ(µ), A).
Proof. Step 1 (Well-definedness of Φ). Let F : A → T X . For each a ∈ A, F (a) ∈ T X
is (by definition of T X ) a function F (a) : X → T . Hence for each (a, x) ∈ A × X the value
F (a)(x) ∈ T is uniquely determined. Therefore Φ(F ) : A × X → T given by (a, x) 7→ F (a)(x)
is a well-defined function.
Step 2 (Well-definedness of Ψ). Let µ : A × X → T . Fix a ∈ A. Define Ψ(µ)(a) : X → T
by x 7→ µ(a, x). This is a well-defined function in T X . Thus a 7→ Ψ(µ)(a) defines a well-defined
mapping Ψ(µ) : A → T X .
Step 3 (Φ and Ψ are inverses). For F : A → T^X and any (a, x) ∈ A × X,

(Ψ ◦ Φ)(F)(a)(x) = Ψ(Φ(F))(a)(x) = Φ(F)(a, x) = F(a)(x)   (∀a ∈ A, ∀x ∈ X),


# Page. 38

![Page Image](https://bcdn.docswell.com/page/47QY6V65EP.jpg)

so Ψ(Φ(F )) = F as functions A → T X .
Similarly, for µ : A × X → T and any (a, x) ∈ A × X ,
(Φ ◦ Ψ)(µ)(a, x) = Ψ(µ)(a)(x) = µ(a, x),
so Φ(Ψ(µ)) = µ as functions A × X → T .
Thus Φ is bijective with inverse Ψ. The stated uniqueness claims follow immediately from the
existence of this bijection.
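The bijection Φ, Ψ of Theorem 2.19.2 is exactly currying/uncurrying. A minimal Python sketch (the names `Phi`/`Psi` and the string-valued example are ours, not the survey's):

```python
# Currying bijection between F : A -> (X -> T) and mu : A x X -> T.
A = ["a1", "a2"]          # parameters
X = ["x1", "x2", "x3"]    # universe

def Phi(F):
    """Uncurry: turn F : A -> (X -> T) into mu : A x X -> T."""
    return lambda a, x: F(a)(x)

def Psi(mu):
    """Curry: turn mu : A x X -> T back into F : A -> (X -> T)."""
    return lambda a: (lambda x: mu(a, x))

# Example T-valued soft set with T = strings.
F = lambda a: (lambda x: a + ":" + x)
mu = Phi(F)

# Round trips agree pointwise, witnessing Psi(Phi(F)) = F and Phi(Psi(mu)) = mu.
assert all(Psi(Phi(F))(a)(x) == F(a)(x) for a in A for x in X)
assert all(Phi(Psi(mu))(a, x) == mu(a, x) for a in A for x in X)
```

Over finite sets the equalities can only be checked pointwise, which is exactly what the proof's Step 3 does symbolically.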
Definition 2.19.3 (Vector-valued soft set). Let X be a nonempty universe, let E be a nonempty set of parameters, and let A ⊆ E be nonempty. Let V be a vector space over a field K. A vector-valued soft set over X (with parameter set A) is a V-valued soft set

ΘV = (F, A),   F : A → V^X.

Equivalently, it is a map µΘV : A × X → V given by µΘV (a, x) = F(a)(x).
Example 2.19.4 (Vector-valued soft set: multi-criteria scoring of job candidates). Let X =
{x1 , x2 , x3 } be three job candidates and let E be a parameter pool. Choose the active parameter
set
A = {Tech, Comm} ⊆ E
(technical skill and communication skill). Let V = R2 .
Define a vector-valued soft set ΘV = (F, A) by specifying, for each a ∈ A, a function F (a) :
X → R2 (so F : A → (R2 )X ):
F (Tech)(x1 ) = (0.80, 0.70), F (Tech)(x2 ) = (0.60, 0.90), F (Tech)(x3 ) = (0.75, 0.65),
F (Comm)(x1 ) = (0.60, 0.55), F (Comm)(x2 ) = (0.40, 0.60), F (Comm)(x3 ) = (0.85, 0.80).
Interpretation: under Tech the two coordinates represent (coding, algorithms) scores, and under
Comm they represent (presentation, teamwork) scores. Equivalently, µΘV : A × X → R2 is given
by µΘV (a, x) = F (a)(x).
Definition 2.19.5 (Matrix-valued soft set). Let X be a nonempty universe, let E be a nonempty
set of parameters, and let A ⊆ E be nonempty. Fix integers m, n ≥ 1 and a field K, and denote
by Matm×n (K) the space of m × n matrices over K. A matrix-valued soft set over X (with
parameter set A) is a Matm×n (K)-valued soft set
ΘM = (F, A),
F : A −→ Matm×n (K)X .
Equivalently, it is a map µΘM : A × X → Matm×n (K) with µΘM (a, x) = F (a)(x).
Example 2.19.6 (Matrix-valued soft set: shift-dependent state-transition estimates of machines). Let X = {M1 , M2 } be two production machines and take
A = {Day, Night} ⊆ E


# Page. 39

![Page Image](https://bcdn.docswell.com/page/KE4W4M4VJ1.jpg)

as operating-shift parameters. Fix m = n = 2 and K = R, so the codomain is Mat2×2 (R).
Define a matrix-valued soft set ΘM = (F, A) by giving, for each a ∈ A, a function F (a) : X →
Mat2×2(R):

F(Day)(M1) = [0.97 0.03; 0.40 0.60],   F(Day)(M2) = [0.95 0.05; 0.30 0.70],

F(Night)(M1) = [0.94 0.06; 0.50 0.50],   F(Night)(M2) = [0.92 0.08; 0.45 0.55].
Interpretation: each 2×2 matrix is a simple estimated transition matrix between states {OK, Fault}
for the corresponding shift (rows = current state, columns = next state). Equivalently, µΘM (a, x) =
F (a)(x) ∈ Mat2×2 (R).
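Since the example reads each value as a transition matrix, a natural sanity check is row-stochasticity. A brief Python sketch (ours, using plain nested lists rather than any matrix library):

```python
# Matrix-valued soft set: (shift, machine) -> estimated 2x2 transition matrix
# over states {OK, Fault}; rows = current state, columns = next state.
F = {
    ("Day", "M1"):   [[0.97, 0.03], [0.40, 0.60]],
    ("Day", "M2"):   [[0.95, 0.05], [0.30, 0.70]],
    ("Night", "M1"): [[0.94, 0.06], [0.50, 0.50]],
    ("Night", "M2"): [[0.92, 0.08], [0.45, 0.55]],
}

def is_row_stochastic(M, tol=1e-9):
    """Every entry lies in [0, 1] and every row sums to 1."""
    return (all(0.0 <= x <= 1.0 for row in M for x in row)
            and all(abs(sum(row) - 1.0) < tol for row in M))

assert all(is_row_stochastic(M) for M in F.values())
```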
Definition 2.19.7 (Tensor-valued soft set). Let X be a nonempty universe, let E be a nonempty
set of parameters, and let A ⊆ E be nonempty. Fix a field K and vector spaces V1 , . . . , Vr over
K (r ≥ 1), and put
T := V1 ⊗ · · · ⊗ Vr
(the r-th order tensor space of type (V1 , . . . , Vr )). A tensor-valued soft set over X (with parameter set A) is a T -valued soft set
ΘT = (F, A),   F : A → T^X.
Equivalently, it is a map µΘT : A × X → T with µΘT (a, x) = F (a)(x).
Example 2.19.8 (Tensor-valued soft set: store-dependent 2 × 2 × 2 demand-context tensor).
Let X = {S1 , S2 } be two retail stores and let
A = {Weekday, Weekend} ⊆ E
be context parameters. Fix K = R and take
V1 = R2 (demand: Low/High),
V2 = R2 (weather: Cool/Hot),
V3 = R2 (promotion: Off/On),
so T = V1 ⊗ V2 ⊗ V3 can be represented as 2 × 2 × 2 arrays.
Define a tensor-valued soft set ΘT = (F, A) by specifying F (a) : X → T . For readability, we
write F (a)(x) = [tijk ]i,j,k∈{1,2} as two 2 × 2 slices (promotion Off/On):
Store S1 .

0.30
F (Weekday)(S1 ) : k = 1 (Off) ⇒
0.20

0.20
F (Weekend)(S1 ) : k = 1 (Off) ⇒
0.25


0.10
0.15
, k = 2 (On) ⇒
0.05
0.10


0.10
0.10
, k = 2 (On) ⇒
0.10
0.15

0.05
,
0.05

0.05
.
0.05
Store S2 .




0.35 0.10
0.20 0.05
F (Weekday)(S2 ) : k = 1 (Off) ⇒
, k = 2 (On) ⇒
.
0.15 0.05
0.08 0.02
Interpretation: tijk is a store- and context-dependent weight (e.g., empirical frequency) for the
joint situation (demand i, weather j , promotion k ). Equivalently, µΘT (a, x) = F (a)(x) ∈
V1 ⊗ V2 ⊗ V3 .
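Each value of this tensor-valued soft set can be stored as a nested 2×2×2 list. In the sketch below (ours; the indexing `T[i][j][k]` for demand/weather/promotion is an assumption about the layout of the slices above), the weights of each listed tensor sum to 1, consistent with reading t_ijk as empirical frequencies:

```python
# Tensor-valued soft set: (context, store) -> 2x2x2 weight array.
# Indexing: T[i][j][k] with i = demand (Low/High), j = weather (Cool/Hot),
# k = promotion (Off/On).
F = {
    ("Weekday", "S1"): [[[0.30, 0.15], [0.10, 0.05]],
                        [[0.20, 0.10], [0.05, 0.05]]],
    ("Weekend", "S1"): [[[0.20, 0.10], [0.10, 0.05]],
                        [[0.25, 0.15], [0.10, 0.05]]],
    ("Weekday", "S2"): [[[0.35, 0.20], [0.10, 0.05]],
                        [[0.15, 0.08], [0.05, 0.02]]],
}

def total(T):
    """Sum of all eight entries of a 2x2x2 tensor."""
    return sum(x for plane in T for row in plane for x in row)

assert all(abs(total(T) - 1.0) < 1e-9 for T in F.values())
```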


# Page. 40

![Page Image](https://bcdn.docswell.com/page/L71Y4844JG.jpg)

2.20 Cubic Soft Set
Cubic soft sets map each parameter to cubic sets: interval-valued membership degrees plus fuzzy
membership, modeling uncertainty with dual information. The definition of the Cubic Soft Set
is described as follows [81–83].
Definition 2.20.1 (Cubic Soft Set). [81] Let X be a nonempty universe and let E be a set of
parameters. First, a cubic set in X is defined as a mapping that assigns to each element x ∈ X
a pair
h[A− (x), A+ (x)], λ(x)i,
where:
• [A− (x), A+ (x)] ⊆ [0, 1] is an interval representing the degree of membership as given by
an interval-valued fuzzy set, and
• λ(x) ∈ [0, 1] is the membership degree provided by a fuzzy set.
Then, a cubic soft set over X with respect to the parameter set E is a mapping
F : E → {cubic sets in X}.
Equivalently, a cubic soft set can be expressed as the collection

F̃ = {(e, F̃(e)) : e ∈ E},

where for each parameter e ∈ E, the set F̃(e) is a cubic set in X; that is,

F̃(e) = { ⟨x, [A_e^−(x), A_e^+(x)], λ_e(x)⟩ : x ∈ X }.
Example 2.20.2 (Real-life example of a cubic soft set: apartment screening with uncertain
scores and point estimates). Let X be a set of apartments:
X = {a1 , a2 , a3 , a4 }.
Let E be a set of decision parameters and consider
E = {eSafe , eComm },
where eSafe = “neighborhood safety” and eComm = “commute convenience”.
A cubic soft set F : E → {cubic sets in X} assigns to each parameter e ∈ E a cubic set on X of the form

F̃(e) = { ⟨x, [A_e^−(x), A_e^+(x)], λ_e(x)⟩ | x ∈ X },

where [A_e^−(x), A_e^+(x)] is an interval-valued membership (reflecting uncertainty in the score) and λ_e(x) is a single membership degree (a point estimate).
For instance, based on crime statistics and expert judgment, suppose safety is assessed as:
F̃(eSafe) = { ⟨a1, [0.70, 0.85], 0.80⟩, ⟨a2, [0.40, 0.60], 0.50⟩, ⟨a3, [0.55, 0.75], 0.65⟩, ⟨a4, [0.20, 0.35], 0.30⟩ }.

Similarly, using travel-time variability data, suppose commute convenience is assessed as:

F̃(eComm) = { ⟨a1, [0.30, 0.50], 0.40⟩, ⟨a2, [0.75, 0.90], 0.85⟩, ⟨a3, [0.50, 0.70], 0.60⟩, ⟨a4, [0.60, 0.80], 0.70⟩ }.

Then F is a cubic soft set on X: each parameter e is associated with a cubic evaluation of every apartment, combining an interval-valued degree (uncertainty band) with a representative point degree. In decision-making, one may aggregate F̃(eSafe) and F̃(eComm) to rank apartments under both safety and commuting considerations.
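A cubic value is admissible only when its interval and point estimate stay inside [0, 1] with A⁻ ≤ A⁺. A small Python sketch (ours, not from the survey) storing the apartment data and checking this:

```python
# Cubic soft set: parameter -> {apartment: ((A_minus, A_plus), lam)}.
F = {
    "eSafe": {"a1": ((0.70, 0.85), 0.80), "a2": ((0.40, 0.60), 0.50),
              "a3": ((0.55, 0.75), 0.65), "a4": ((0.20, 0.35), 0.30)},
    "eComm": {"a1": ((0.30, 0.50), 0.40), "a2": ((0.75, 0.90), 0.85),
              "a3": ((0.50, 0.70), 0.60), "a4": ((0.60, 0.80), 0.70)},
}

def is_cubic_value(value):
    """Valid cubic pair: 0 <= A- <= A+ <= 1 and 0 <= lam <= 1."""
    (lo, hi), lam = value
    return 0.0 <= lo <= hi <= 1.0 and 0.0 <= lam <= 1.0

assert all(is_cubic_value(v) for cubic in F.values() for v in cubic.values())
```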


# Page. 41

![Page Image](https://bcdn.docswell.com/page/G7WGXZXZE2.jpg)

2.21 Probabilistic Soft Set
A probabilistic soft set maps each parameter to a probability distribution over the universe,
modeling uncertainty via normalized likelihoods [84, 85].
Definition 2.21.1 (Probabilistic Soft Set). [84, 85] Let U be a non-empty finite universe and
E be a set of parameters. Let A ⊆ E be a subset of parameters. Denote by
D(U) = { µ : U → [0, 1] | Σ_{u∈U} µ(u) = 1 }

the set of all probability distributions on U. A probabilistic soft set over U is a pair (F, A) where

F : A → D(U)

such that for each e ∈ A, the function F(e) : U → [0, 1] satisfies

Σ_{u∈U} F(e)(u) = 1.
In other words, for every parameter e ∈ A, F (e) is a probability distribution on U .
Example 2.21.2 (Real-life example of a probabilistic soft set: choosing a commuting route
under different criteria). Let U be a finite set of candidate commuting routes:
U = {r1 , r2 , r3 , r4 }.
Let E be a set of decision parameters and take
A = {eFast , eCheap , eComfort } ⊆ E,
where eFast = “fastest”, eCheap = “cheapest”, and eComfort = “most comfortable”.
A probabilistic soft set (F, A) assigns to each parameter e ∈ A a probability distribution on U ,
interpreted as the likelihood that each route is the best choice under criterion e (e.g., estimated
from historical travel data and user preference models).
For instance, define F : A → D(U) by the following distributions:

| F(e)(·) | r1 | r2 | r3 | r4 |
|---|---|---|---|---|
| F(eFast)(·) | 0.55 | 0.25 | 0.15 | 0.05 |
| F(eCheap)(·) | 0.10 | 0.20 | 0.60 | 0.10 |
| F(eComfort)(·) | 0.15 | 0.50 | 0.10 | 0.25 |
Each row sums to 1, so each F (e) is a probability distribution on U .
Thus (F, A) is a probabilistic soft set over U . Interpretation: under “fastest” the model favors
route r1 , under “cheapest” it favors r3 , and under “most comfortable” it favors r2 .
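A short Python sketch (ours) encoding the route example, checking that each F(e) is a probability distribution on U, and reading off the most likely route per criterion:

```python
# Probabilistic soft set: parameter -> probability distribution over routes.
F = {
    "eFast":    {"r1": 0.55, "r2": 0.25, "r3": 0.15, "r4": 0.05},
    "eCheap":   {"r1": 0.10, "r2": 0.20, "r3": 0.60, "r4": 0.10},
    "eComfort": {"r1": 0.15, "r2": 0.50, "r3": 0.10, "r4": 0.25},
}

def is_distribution(mu, tol=1e-9):
    """Entries in [0, 1] summing to 1."""
    return (all(0.0 <= p <= 1.0 for p in mu.values())
            and abs(sum(mu.values()) - 1.0) < tol)

def best(mu):
    """Route with the highest likelihood under the given criterion."""
    return max(mu, key=mu.get)

assert all(is_distribution(F[e]) for e in F)
```

`best(F["eFast"])`, `best(F["eCheap"])`, and `best(F["eComfort"])` recover the routes favored in the interpretation above.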
The comparison of classical soft sets and probabilistic soft sets is presented in Table 2.4.


# Page. 42

![Page Image](https://bcdn.docswell.com/page/4JZL616LE3.jpg)

Table 2.4: Concise comparison of classical soft sets and probabilistic soft sets over a finite universe U.

| Aspect | Soft set | Probabilistic soft set |
|---|---|---|
| Universe/parameters | Finite (or general) universe U and parameter set A ⊆ E. | Finite universe U and parameter set A ⊆ E (typically finite to interpret distributions). |
| Value assigned to a parameter a ∈ A | A crisp subset F(a) ⊆ U. | A probability distribution F(a) ∈ D(U), i.e. F(a) : U → [0, 1] with Σ_{u∈U} F(a)(u) = 1. |
| Mathematical type of F | F : A → P(U). | F : A → D(U) ⊆ [0, 1]^U. |
| Interpretation | “Accepted/feasible objects under parameter a” (yes/no membership). | “Likelihood/degree of preference of each object under parameter a” (normalized uncertainty). |
| Aggregation across parameters | Often via set operations (union/intersection) or counting scores. | Often via probabilistic combination (e.g., weighted mixtures, Bayesian updates, expected-utility rules). |
| Relation between the two | Baseline model with crisp information. | Generalizes soft sets: a soft set can be embedded by using point-mass distributions on F(a) (or thresholding F(a)). |
2.22 D-soft set
A D-soft set maps each parameter to a D-number mass assignment on subsets, handling incompleteness and nonexclusive evidence.
Definition 2.22.1 (D-number on U ). [86,87] Let U be a nonempty finite universe. A D-number
on U is a mapping
D : 2^U → [0, 1]

satisfying

D(∅) = 0,   Σ_{B⊆U} D(B) ≤ 1.

(Unlike classical Dempster–Shafer basic probability assignments, D-number theory does not require the elements of U to be mutually exclusive, and it allows incomplete information when the above sum is < 1.) Let

DNum(U) := { D : 2^U → [0, 1] | D(∅) = 0, Σ_{B⊆U} D(B) ≤ 1 }

be the set of all D-numbers on U.
Definition 2.22.2 (D-soft set). Let U be a nonempty finite universe and let E be a nonempty
set of parameters. Let A ⊆ E be nonempty. A D-soft set over U with parameter set A is a pair
(F, A) where
F : A −→ DNum(U ).
Thus, for each parameter e ∈ A, the value F(e) = De is a D-number on U, i.e., a mass assignment De : 2^U → [0, 1] with Σ_{B⊆U} De(B) ≤ 1.


# Page. 43

![Page Image](https://bcdn.docswell.com/page/YE6W2L2MEV.jpg)

Example 2.22.3 (Real-life example of a D-soft set: supplier selection under incomplete and
nonexclusive evidence). Let U be a finite set of candidate suppliers for a manufacturing order:
U = {s1 , s2 , s3 , s4 }.
Let E be a set of evaluation parameters and take the nonempty subset
A = {eQual , eDeliv } ⊆ E,
where eQual = “high product quality” and eDeliv = “reliable delivery”.
A D-soft set (F, A) assigns to each parameter e ∈ A a D-number De ∈ DNum(U), i.e., a map De : 2^U → [0, 1] with De(∅) = 0 and Σ_{B⊆U} De(B) ≤ 1.
(1) Evidence for quality. Suppose quality audits provide the following (possibly incomplete)
evidence:
DeQual({s1}) = 0.45,   DeQual({s1, s3}) = 0.20,   DeQual({s2}) = 0.10,   DeQual({s2, s4}) = 0.05,
DeQual(B) = 0 for all other B ⊆ U.
Then

Σ_{B⊆U} DeQual(B) = 0.45 + 0.20 + 0.10 + 0.05 = 0.80 ≤ 1,
so DeQual ∈ DNum(U ); the remaining mass 1 − 0.80 = 0.20 represents unassigned (unknown)
information.
(2) Evidence for delivery. From shipping records and logistics reports, suppose we obtain:
DeDeliv ({s3 }) = 0.30,
DeDeliv ({s1 , s3 }) = 0.25,
DeDeliv ({s1 , s2 , s3 }) = 0.10,
DeDeliv (B) = 0 for all other B ⊆ U.
Hence

Σ_{B⊆U} DeDeliv(B) = 0.30 + 0.25 + 0.10 = 0.65 ≤ 1,
so DeDeliv ∈ DNum(U ). Note that the focal sets {s3 } ⊆ {s1 , s3 } ⊆ {s1 , s2 , s3 } are not mutually
exclusive, which is allowed in D-number modeling.
(3) The D-soft set. Define F : A → DNum(U ) by
F (eQual ) = DeQual ,
F (eDeliv ) = DeDeliv .
Then (F, A) is a D-soft set over U: each parameter is associated with a D-number encoding subset-valued evidence, while permitting incompleteness (Σ_B De(B) < 1) and nonexclusive focal sets.
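Representing focal sets as frozensets makes the D-number conditions directly checkable. A sketch (ours, not from the survey) of the supplier example:

```python
# D-soft set: parameter -> {focal set: mass}; total mass may be below 1.
F = {
    "eQual": {frozenset({"s1"}): 0.45, frozenset({"s1", "s3"}): 0.20,
              frozenset({"s2"}): 0.10, frozenset({"s2", "s4"}): 0.05},
    "eDeliv": {frozenset({"s3"}): 0.30, frozenset({"s1", "s3"}): 0.25,
               frozenset({"s1", "s2", "s3"}): 0.10},
}

def is_d_number(D, tol=1e-9):
    """D(empty) = 0 (the empty set is simply absent) and total mass at most 1."""
    return (frozenset() not in D
            and all(0.0 <= m <= 1.0 for m in D.values())
            and sum(D.values()) <= 1.0 + tol)

def unassigned(D):
    """Mass left unallocated: 1 minus the sum of focal masses."""
    return 1.0 - sum(D.values())

assert all(is_d_number(F[e]) for e in F)
```

`unassigned(F["eQual"])` recovers the 0.20 of unknown information noted in the example, and nested focal sets such as {s3} ⊆ {s1, s3} coexist without restriction.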
A comparison between a classical soft set and a D-soft set is presented in Table 2.5.


# Page. 44

![Page Image](https://bcdn.docswell.com/page/GE5M212QE4.jpg)

Table 2.5: Concise comparison between a classical soft set and a D-soft set over a (finite) universe U.

| Aspect | Classical soft set (F, A) | D-soft set (F, A) |
|---|---|---|
| Universe requirement | U any nonempty set (often finite in applications). | U is assumed finite since each value is a D-number on 2^U. |
| Parameter domain | Nonempty A ⊆ E. | Nonempty A ⊆ E. |
| Codomain / value type | F(e) ∈ P(U) (a crisp subset of alternatives). | F(e) = De ∈ DNum(U) where De : 2^U → [0, 1] (a subset-mass assignment). |
| Meaning of F(e) | Objects accepted under parameter e (yes/no selection). | Evidence about which subset(s) of U are plausible under e, via focal sets with masses. |
| Granularity of information | Element-level inclusion only. | Subset-level evidence; can express ambiguity such as “either u1 or u3” by mass on {u1, u3}. |
| Normalization / completeness | No probability-like normalization constraint. | Σ_{B⊆U} De(B) ≤ 1; the gap 1 − Σ_B De(B) represents unassigned/unknown information. |
| Mutual exclusivity requirement | Not an evidence model; typically treats alternatives crisply. | D-number framework does not require mutual exclusivity of focal sets (and may model nonexclusive evidence). |
| Typical decision extraction | Scoring/ranking by counting satisfied parameters (or other set-based aggregations). | Ranking via belief/plausibility/credibility transforms or other mass-to-score rules (application-dependent). |
2.23 Complex Soft Sets
A complex soft set assigns to each parameter a complex-valued membership function on the
universe, thereby encoding both magnitude and phase information. Related notions include
complex fuzzy sets [88, 89] and complex neutrosophic sets [90–92].
Definition 2.23.1 (Complex fuzzy set). [93, 94] Let U be a nonempty universe. Define the
closed unit disk
D := { z ∈ C | |z| ≤ 1 }.
A complex fuzzy set (CFS) on U is a mapping
µA : U −→ D.
Equivalently, for each u ∈ U we may write

µA(u) = rA(u) e^{iωA(u)},   rA(u) ∈ [0, 1],  ωA(u) ∈ [0, 2π],

where rA(u) is the amplitude (membership magnitude) and ωA(u) is the phase. We denote by
CFS(U ) the family of all complex fuzzy sets on U .
Definition 2.23.2 (Complex soft set). Let U be a nonempty universe and let E be a nonempty
set of parameters. Let A ⊆ E be nonempty. A complex soft set (CSS) over U with parameter
set A is a pair (F, A) where
F : A −→ CFS(U ).


# Page. 45

![Page Image](https://bcdn.docswell.com/page/9729414WJR.jpg)

Thus, for each parameter e ∈ A, the value F (e) is a complex fuzzy set on U , i.e.,
F (e) = µe : U −→ D.
Equivalently, a complex soft set can be identified with a single mapping
µ : A × U −→ D,
µ(e, u) := µe (u),
such that for every fixed e ∈ A, the section µe (·) is a complex fuzzy membership function on U .
Remark 2.23.3 (Reductions). 1. If µ(e, u) ∈ [0, 1] ⊂ C (equivalently, all phases are 0), then
(F, A) reduces to a fuzzy soft set.
2. If µ(e, u) ∈ {0, 1} for all (e, u), then (F, A) reduces to a (crisp) soft set.
Example 2.23.4 (Real-life example of a complex soft set: wearable-sensor sleep assessment with
confidence phase). Let U be a set of nights (sleep records) for a person:
U = {n1 , n2 , n3 , n4 }.
Let E be a set of assessment parameters and take
A = {eDeep , eStress } ⊆ E,
where eDeep = “deep-sleep quality” and eStress = “low night-time stress”.
A complex soft set (F, A) assigns to each parameter e ∈ A a complex fuzzy set µe : U → D =
{z ∈ C | |z| ≤ 1}. Interpret the amplitude r(e, n) ∈ [0, 1] as the degree to which night n satisfies
e, and interpret the phase ω(e, n) ∈ [0, 2π] as a confidence/regularity marker derived from signal
quality (e.g., stable sensors yield small phase, noisy sensors yield larger phase).
Define µ : A × U → D by µ(e, n) = r(e, n) e^{iω(e,n)} with the following values:

µ(eDeep, n1) = 0.80 e^{i·0.10π},   µ(eDeep, n2) = 0.45 e^{i·0.35π},   µ(eDeep, n3) = 0.70 e^{i·0.15π},   µ(eDeep, n4) = 0.20 e^{i·0.60π},
µ(eStress, n1) = 0.60 e^{i·0.20π},   µ(eStress, n2) = 0.30 e^{i·0.55π},   µ(eStress, n3) = 0.75 e^{i·0.10π},   µ(eStress, n4) = 0.50 e^{i·0.…}.
For each fixed e ∈ A, the section µe (·) = µ(e, ·) : U → D is a complex membership function, so
it defines a complex fuzzy set F (e) = µe ∈ CFS(U ). Hence (F, A) (equivalently µ) is a complex
soft set.
Interpretation: the magnitude |µ(e, n)| = r(e, n) expresses how well the night n satisfies the
criterion e, while the phase arg(µ(e, n)) = ω(e, n) encodes a secondary aspect such as confidence
or signal stability, which is useful when sensor quality varies across nights.
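Python's built-in complex numbers model the unit-disk codomain directly; a sketch (ours, covering the fully specified nights of the example):

```python
# Complex soft set: (parameter, night) -> point in the closed unit disk D.
import cmath

def cval(r, omega):
    """r * e^{i*omega} as a Python complex number."""
    return cmath.rect(r, omega)

mu = {
    ("eDeep", "n1"): cval(0.80, 0.10 * cmath.pi),
    ("eDeep", "n2"): cval(0.45, 0.35 * cmath.pi),
    ("eDeep", "n3"): cval(0.70, 0.15 * cmath.pi),
    ("eStress", "n1"): cval(0.60, 0.20 * cmath.pi),
    ("eStress", "n2"): cval(0.30, 0.55 * cmath.pi),
    ("eStress", "n3"): cval(0.75, 0.10 * cmath.pi),
}

# Every value lies in the closed unit disk D, as Definition 2.23.1 requires.
assert all(abs(z) <= 1.0 for z in mu.values())
```

`abs(z)` recovers the amplitude r(e, n) and `cmath.phase(z)` the phase ω(e, n), matching the interpretation above.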


# Page. 46

![Page Image](https://bcdn.docswell.com/page/DJY4MZM97M.jpg)

2.24 Real Soft Set
A real soft set maps each parameter to a bounded nonempty subset of real numbers, representing
parameter-dependent possible real values [95–97].
Definition 2.24.1 (Soft real set (real soft set)). [97] Let A be a nonempty set of parameters
and let R be the set of real numbers. Define the collection of all nonempty bounded subsets of
R by
Pb(R) := { B ⊆ R | B ≠ ∅ and B is bounded }.
A soft real set (also called a real soft set) over R with parameter set A is a pair (F, A), where
F : A −→ Pb (R)
is a mapping. For each λ ∈ A, the value F (λ) ⊆ R is interpreted as the (parameter-dependent)
set of possible real values under the parameter λ.
Remark 2.24.2 (Soft real numbers as singleton soft real sets). If (F, A) is a singleton soft real
set, i.e., F (λ) = {r(λ)} for every λ ∈ A, then it is naturally identified with the corresponding
soft real number
r̃ : A −→ R,
r̃(λ) = r(λ).
Example 2.24.3 (Real-life example of a soft real set: delivery-time windows under different
shipping options). Let A be a set of shipping options for an online store:
A = {λStd , λExp , λEco },
where λStd = standard shipping, λExp = express shipping, and λEco = economy shipping.
Consider the real quantity “delivery time” measured in days, so the value domain is R. Because
delivery times are uncertain but bounded for each option, define
F : A −→ Pb (R)
by assigning to each option the (bounded, nonempty) set of plausible delivery times:
F (λStd ) = [2, 5],
F (λExp ) = [1, 2],
F (λEco ) = [4, 9].
Each F (λ) ⊆ R is nonempty and bounded, hence F (λ) ∈ Pb (R). Therefore (F, A) is a soft real
set over R.
Interpretation: the same order may have different feasible delivery-time windows depending on
the chosen shipping parameter λ ∈ A.
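Since every value in this example is a closed interval, a minimal Python sketch (ours; the `(lo, hi)` pair encoding and the `worst_case` helper are illustrative choices, not part of the definition) suffices:

```python
# Soft real set: shipping option -> bounded nonempty interval of delivery days.
F = {"lStd": (2.0, 5.0), "lExp": (1.0, 2.0), "lEco": (4.0, 9.0)}

def is_bounded_nonempty(interval):
    """A closed interval (lo, hi) is nonempty iff lo <= hi; bounded by construction."""
    lo, hi = interval
    return lo <= hi

def worst_case(F):
    """Latest possible delivery time per shipping option."""
    return {lam: hi for lam, (lo, hi) in F.items()}

assert all(is_bounded_nonempty(F[lam]) for lam in F)
```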


# Page. 47

![Page Image](https://bcdn.docswell.com/page/V7NYW3WDE8.jpg)

2.25 Intersectional soft sets
An intersectional soft set satisfies F (x)∩F (y) ⊆ F (x∗y), ensuring combined parameters include
all jointly satisfying objects [98–101].
Definition 2.25.1 (Intersectional soft set). [98–101] Let U be a universal set and let E be a
set of parameters equipped with a binary operation
∗ : E × E −→ E.
Let A ⊆ E be a nonempty subset and let (F, A) be a soft set over U , i.e.,
F : A −→ P(U ).
Then (F, A) is called an intersectional soft set over U (with respect to ∗) if, for all x, y ∈ A such
that x ∗ y ∈ A, one has
F (x) ∩ F (y) ⊆ F (x ∗ y).
Remark 2.25.2. The condition says that whenever the “combined” parameter x∗y is admissible
(lies in A), it must contain all objects that satisfy both x and y simultaneously.
Example 2.25.3 (Real-life example of an intersectional soft set: product filtering with a “bundle” operation). Let U be a set of products in an online store:
U = {p1 , p2 , p3 , p4 , p5 , p6 }.
Let E be a set of product-tag parameters and take
A = {eBio , eGF , eBioGF } ⊆ E,
where
eBio = “organic”,
eGF = “gluten-free”,
eBioGF = “organic and gluten-free”.
Define a binary operation ∗ : E × E → E (a “bundle” of tags) on the relevant parameters by
eBio ∗ eGF = eBioGF ,
eGF ∗ eBio = eBioGF ,
and set x ∗ x = x for x ∈ A (idempotence on A).
Define a soft mapping F : A → P(U ) by listing products that satisfy each tag:
F (eBio ) = {p1 , p2 , p4 },
F (eGF ) = {p2 , p3 , p5 },
F (eBioGF ) = {p2 }.
Then
F (eBio ) ∩ F (eGF ) = {p2 } ⊆ F (eBioGF ),
so the intersectional condition F (x) ∩ F (y) ⊆ F (x ∗ y) holds for x = eBio and y = eGF (and
similarly for the symmetric order).
Interpretation: the combined tag parameter eBioGF must include every product that is both
organic and gluten-free, ensuring consistency of the tag-bundling rule.
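The intersectional condition F(x) ∩ F(y) ⊆ F(x∗y) can be verified exhaustively once ∗ is given as a lookup table; a Python sketch (ours) for the product-tag example:

```python
# Intersectional soft set: tags -> product sets, with a partial "bundle" operation.
F = {
    "eBio":   {"p1", "p2", "p4"},
    "eGF":    {"p2", "p3", "p5"},
    "eBioGF": {"p2"},
}

# Partial bundle operation * on A (idempotent on each tag).
star = {("eBio", "eGF"): "eBioGF", ("eGF", "eBio"): "eBioGF"}
star.update({(x, x): x for x in F})

def is_intersectional(F, star):
    """F(x) ∩ F(y) ⊆ F(x*y) whenever x*y is an admissible parameter."""
    return all(F[x] & F[y] <= F[z]
               for (x, y), z in star.items() if z in F)

assert is_intersectional(F, star)
```

If, say, p4 were also tagged gluten-free without being added to F("eBioGF"), the check would fail, which is exactly the consistency the definition enforces.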


# Page. 48

![Page Image](https://bcdn.docswell.com/page/YJ9PX9X873.jpg)

2.26 N-soft Sets
An N -soft set assigns to each object, for each parameter, a discrete grade from {0, . . . , N − 1},
thereby yielding graded approximations [102–105]. Related notions include N -HyperSoft sets
[106, 107].
Definition 2.26.1 (N -soft set). [102, 103] Let U be a nonempty universe of discourse and let
E be a nonempty set of parameters. Fix an integer N ≥ 2 and let
GN := {0, 1, . . . , N − 1}
be a set of (ordered) grades. Let A ⊆ E be nonempty.
A triple (F, A, N ) is called an N -soft set on U (with parameter set A) if
F : A −→ P(U × GN )
satisfies the uniqueness-of-grade condition: for every a ∈ A and every u ∈ U , there exists a
unique r ∈ GN such that
(u, r) ∈ F (a).
Equivalently, for each a ∈ A the set F (a) ⊆ U × GN is the graph of a unique function
ga : U −→ GN ,
ga (u) = r ⇐⇒ (u, r) ∈ F (a),
and hence the whole N -soft set can be identified with a single information function
g : A × U −→ GN ,
g(a, u) := ga (u).
Example 2.26.2 (Real-life example of an N -soft set: grading students by discrete performance
levels). Let U be a set of students in a class:
U = {s1 , s2 , s3 , s4 , s5 }.
Let E be a set of evaluation parameters and take
A = {aMath , aProg } ⊆ E,
where aMath = “mathematics” and aProg = “programming”.
Fix N = 5 and hence
G5 = {0, 1, 2, 3, 4},
interpreted as ordered grades
0 = very poor ≺ 1 = poor ≺ 2 = average ≺ 3 = good ≺ 4 = excellent.
Define the information function g : A × U → G5 by discrete rubric-based assessments:
| g(a, u) | aMath | aProg |
|---|---|---|
| s1 | 4 | 3 |
| s2 | 2 | 4 |
| s3 | 1 | 2 |
| s4 | 3 | 1 |
| s5 | 0 | 2 |


# Page. 49

![Page Image](https://bcdn.docswell.com/page/GJ8D292ZJD.jpg)

Equivalently, define F : A → P(U ×G5 ) by taking graphs of the grade functions ga (u) := g(a, u):
F (aMath ) = {(s1 , 4), (s2 , 2), (s3 , 1), (s4 , 3), (s5 , 0)},
F (aProg ) = {(s1 , 3), (s2 , 4), (s3 , 2), (s4 , 1), (s5 , 2)}.
For each a ∈ A and u ∈ U , there is a unique r ∈ G5 such that (u, r) ∈ F (a), so (F, A, 5) is an
N -soft set on U .
Interpretation: each student receives a discrete grade for each parameter (subject), and the
resulting N -soft set supports graded decision rules such as selecting students with g(aProg , u) ≥ 3.
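The grading example above can be checked with a short Python sketch; the dictionary encoding and all identifiers below are our own illustration, not notation from the text.

```python
# Illustrative encoding of the N-soft set (F, A, 5): g maps (parameter, object) to a grade.
N = 5
students = ["s1", "s2", "s3", "s4", "s5"]
params = ["math", "prog"]
g = {
    ("math", "s1"): 4, ("math", "s2"): 2, ("math", "s3"): 1,
    ("math", "s4"): 3, ("math", "s5"): 0,
    ("prog", "s1"): 3, ("prog", "s2"): 4, ("prog", "s3"): 2,
    ("prog", "s4"): 1, ("prog", "s5"): 2,
}

# Uniqueness-of-grade holds by construction: g is a total function A x U -> G_N.
assert all((a, u) in g and 0 <= g[(a, u)] < N for a in params for u in students)

# The graded decision rule from the example: students with g(prog, u) >= 3.
good_programmers = [u for u in students if g[("prog", u)] >= 3]
print(good_programmers)  # ['s1', 's2']
```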
2.27 n-ary soft set
An n-ary soft set maps each parameter to an n-tuple of subsets over multiple universes, enabling
multi-attribute approximations.
Definition 2.27.1 (Binary soft set). [108, 109] Let U1 and U2 be two nonempty universe sets,
and let E be a set of parameters. Fix a nonempty parameter subset A ⊆ E . A binary soft set
over (U1 , U2 ) (with parameter set A) is a pair (F, A), where
F : A −→ P(U1 ) × P(U2 ).
For each e ∈ A, we write
F (e) = (Xe , Ye ),    Xe ⊆ U1 , Ye ⊆ U2 .
Equivalently, (F, A) can be identified with two ordinary soft sets (F1 , A) over U1 and (F2 , A)
over U2 , where F1 (e) := Xe and F2 (e) := Ye for all e ∈ A.
Definition 2.27.2 (n-ary soft set). Let n ∈ N with n ≥ 2, let U1 , . . . , Un be nonempty universe
sets, and let E be a set of parameters. Fix a nonempty parameter subset A ⊆ E . An n-ary soft
set over (U1 , . . . , Un ) (with parameter set A) is a pair (F, A), where
F : A −→ P(U1 ) × P(U2 ) × · · · × P(Un ).
Thus, for each parameter e ∈ A we have an n-tuple
F (e) = (Xe(1) , Xe(2) , . . . , Xe(n) ),
Xe(i) ⊆ Ui (i = 1, . . . , n).
Equivalently, (F, A) is uniquely determined by an n-tuple of (ordinary) soft sets
(Fi , A) over Ui
(i = 1, . . . , n),
via the component maps Fi : A → P(Ui ) defined by
Fi (e) := Xe(i)
(e ∈ A, i = 1, . . . , n),
so that F (e) = (F1 (e), . . . , Fn (e)) for all e ∈ A.
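The coordinatewise decomposition in Definition 2.27.2 can be sketched for the binary (n = 2) case; the universes and parameter names below are invented for illustration only.

```python
# A binary soft set F : A -> P(U1) x P(U2), with made-up universes U1, U2.
U1 = {"x1", "x2", "x3"}
U2 = {"y1", "y2"}
F = {
    "e1": ({"x1", "x2"}, {"y2"}),
    "e2": ({"x3"}, {"y1", "y2"}),
}

# Component soft sets F1 over U1 and F2 over U2 recover F coordinatewise,
# exactly as in the equivalence stated in the definition.
F1 = {e: pair[0] for e, pair in F.items()}
F2 = {e: pair[1] for e, pair in F.items()}
assert all(F[e] == (F1[e], F2[e]) for e in F)
```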


# Page. 50

![Page Image](https://bcdn.docswell.com/page/LJLM2W21ER.jpg)

2.28 Linguistic Soft Set
A linguistic soft set maps parameters to subsets of the universe by using ordered linguistic terms,
thereby enabling qualitative evaluations instead of numeric membership degrees. Linguistic
hypersoft sets have been studied in the literature [110, 111]; here we rewrite the concept in
the (classical) linguistic soft set form. Related models have also been investigated in various
contexts [112–116].
Definition 2.28.1 (Linguistic Soft Set). [110, 111] Let Ω be a finite universe of objects (e.g.,
rural health service centers), and let E be a nonempty set of parameters (criteria). Fix a
nonempty subset A ⊆ E of parameters to be used.
For each parameter e ∈ A, let Υe be a finite, strictly ordered set of linguistic values,
Υe = {κe,1 , κe,2 , . . . , κe,me },
κe,1 ≺ κe,2 ≺ · · · ≺ κe,me ,
(where, for example, κe,1 = “very low” and κe,me = “very high”). Let
Υ := ⊔e∈A Υe
denote the disjoint union of the linguistic term sets.
A linguistic soft set (LSS) over Ω with parameter set A is a pair (Γ, A), where
Γ : A −→ P(Ω) × Υ
is a mapping such that, for each e ∈ A, there exists a linguistic term κe ∈ Υe with
Γ(e) = (Γe , κe ),    Γe ∈ P(Ω), κe ∈ Υe .
Equivalently, one may view an LSS as an annotated soft set
(Γ, A) = { (e, κe , Γe ) | e ∈ A },
where Γe is the subset of objects described (or selected) by the linguistic evaluation κe under
the criterion e.
In applications, the annotation κe is determined by experts or data-driven rules, and Γe collects
the objects in Ω that satisfy the parameter e at the linguistic level κe .
Example 2.28.2 (Real-life example of a Linguistic Soft Set: rating restaurants by linguistic
service/price levels). Let Ω be a small set of restaurants:
Ω = {r1 , r2 , r3 , r4 , r5 }.
Let E be a set of evaluation criteria and take
A = {eSrv , ePrice } ⊆ E,
where eSrv = “service quality” and ePrice = “price level”.


# Page. 51

![Page Image](https://bcdn.docswell.com/page/47MY89857W.jpg)

For each e ∈ A, fix an ordered set of linguistic values:
ΥeSrv = {poor ≺ fair ≺ good ≺ excellent},
ΥePrice = {cheap ≺ moderate ≺ expensive}.
Let Υ = ΥeSrv ⊔ ΥePrice .
Define Γ : A → P(Ω) × Υ by assigning to each criterion a linguistic label together with the
subset of restaurants that match that label (according to aggregated reviews):
Γ(eSrv ) = ({r1 , r3 }, excellent),    Γ(ePrice ) = ({r2 , r5 }, cheap).
Equivalently, the linguistic soft set is the annotated collection
(Γ, A) = { (eSrv , excellent, {r1 , r3 }), (ePrice , cheap, {r2 , r5 }) }.
Interpretation: under the criterion “service quality”, restaurants r1 and r3 are assessed as excellent; under “price level”, restaurants r2 and r5 are assessed as cheap. Such a linguistic soft
set supports qualitative filtering (e.g., seeking restaurants with excellent service or cheap price)
without introducing numeric membership degrees.
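The qualitative filtering step can be illustrated with a minimal Python sketch of the restaurant example; the tuple encoding is our own convention, not part of the definition.

```python
# An LSS value Γ(e) encoded as a (linguistic label, subset of Ω) pair.
Gamma = {
    "service": ("excellent", {"r1", "r3"}),
    "price": ("cheap", {"r2", "r5"}),
}

# Qualitative filtering without numeric degrees: restaurants that are
# excellent in service OR cheap in price.
label_srv, set_srv = Gamma["service"]
label_price, set_price = Gamma["price"]
shortlist = sorted(set_srv | set_price)
print(shortlist)  # ['r1', 'r2', 'r3', 'r5']
```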
2.29 MetaSoft Set
A MetaSoft set is a soft set whose universe consists of soft sets, classifying families of soft sets
by meta-parameters [117]. We first fix a general single-sorted, finitary signature
Σ = (Func, Rel, arFunc , arRel ),
where Func (resp. Rel) is a set of function (resp. relation) symbols, and ar records arities. A
(single-sorted) Σ-structure is
C = (H, (f C )f ∈Func , (RC )R∈Rel ),
with carrier H ≠ ∅, interpretations f C : H m → H for each f ∈ Func of arity m, and relations
RC ⊆ H r for each R ∈ Rel of arity r. Let StrΣ denote the class of all Σ-structures.
Definition 2.29.1 (MetaStructure over a fixed signature). (cf. [118]) Fix Σ as above. A MetaStructure (“structure of structures”) over Σ is a pair
M = (U, (Φℓ )ℓ∈Λ ),
where:
• U is a nonempty set with U ⊆ StrΣ (its elements are objects at level 0);
• for each label ℓ ∈ Λ of meta-arity kℓ ∈ N, the meta-operation
Φℓ : U kℓ −→ U


# Page. 52

![Page Image](https://bcdn.docswell.com/page/P7R95G5ZE9.jpg)

is specified by uniform carrier- and symbol-constructors:
Γℓ : (C1 , . . . , Ckℓ ) ↦ Hℓ    (new carrier Hℓ built functorially);
∀f ∈ Func :  f Φℓ (C1 ,...,Ckℓ ) = Λℓf (f C1 , . . . , f Ckℓ );
∀R ∈ Rel :  RΦℓ (C1 ,...,Ckℓ ) = ΞℓR (RC1 , . . . , RCkℓ ),
where Λℓf and ΞℓR are uniform recipes turning the symbols’ interpretations on the inputs into
the symbol’s interpretation on the output, over the new carrier Hℓ .
Moreover, each Φℓ is isomorphism-invariant (a.k.a. natural): if αi : Ci ≅ Di for 1 ≤ i ≤ kℓ ,
then there is an induced isomorphism
Φℓ (α1 , . . . , αkℓ ) : Φℓ (C1 , . . . , Ckℓ ) ≅ Φℓ (D1 , . . . , Dkℓ )
commuting with all interpretations of symbols of Σ.
Definition 2.29.2 (MetaSoft Set). Let U be a nonempty universe of objects and let S be a
(possibly finite) set of parameters. A (crisp) soft set on (U, S) is a mapping
F : S −→ P(U ),
and we denote by
Soft(U, S) := { F | F : S → P(U ) }
the collection of all such soft sets on (U, S).
Let Π be a nonempty set of meta-parameters. A MetaSoft Set on (U, S) with meta-parameter
set Π is a soft set over the universe Soft(U, S) with parameter set Π, that is, a pair
(G, Π), where
G : Π −→ P(Soft(U, S)).
For each π ∈ Π, the value G(π) ⊆ Soft(U, S) is interpreted as the family of base soft sets that
satisfy the meta-criterion encoded by π .
Example 2.29.3 (Real-life example of a MetaSoft Set: selecting suitable recommendation profiles). Let U be a set of customers of an online grocery service:
U = {c1 , c2 , c3 , c4 }.
Let S be a set of product-category parameters:
S = {sV , sG , sL },
where sV = “prefers vegan items”, sG = “prefers gluten-free items”, sL = “prefers low-sugar
items”.


# Page. 53

![Page Image](https://bcdn.docswell.com/page/PJXQKXK17X.jpg)

A (base) soft set on (U, S) is a map F : S → P(U ). Consider three candidate recommendation
profiles (three base soft sets) F1 , F2 , F3 ∈ Soft(U, S) defined by:
F1 (sV ) = {c1 , c3 },        F1 (sG ) = {c2 , c3 },    F1 (sL ) = {c1 , c2 },
F2 (sV ) = {c1 , c2 , c3 },   F2 (sG ) = {c3 },         F2 (sL ) = {c1 },
F3 (sV ) = {c4 },             F3 (sG ) = {c2 , c4 },    F3 (sL ) = {c2 , c3 , c4 }.
Thus {F1 , F2 , F3 } ⊆ Soft(U, S) is a small universe of possible “recommendation rules” (soft-set
profiles).
Now let Π be a set of meta-parameters describing constraints on such profiles:
Π = {πBal , πInc },
where
πBal = “balanced coverage”,    πInc = “inclusive vegan”.
Define G : Π → P(Soft(U, S)) by
G(πBal ) = { F ∈ Soft(U, S) | |F (sV )| ≥ 2, |F (sG )| ≥ 2, |F (sL )| ≥ 2 },
G(πInc ) = { F ∈ Soft(U, S) | F (sV ) ⊇ {c1 , c3 } }.
Then (G, Π) is a MetaSoft Set: for each meta-parameter π ∈ Π, G(π) is a family of base soft
sets (recommendation profiles) satisfying the meta-criterion π .
For the concrete profiles above, one checks that
F1 ∈ G(πBal ),    F2 ∈ G(πInc ),    F3 ∉ G(πInc ),
illustrating how a MetaSoft Set can be used to select or filter candidate soft-set profiles according
to higher-level design requirements.
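The meta-criteria πBal and πInc can be phrased as predicates over base soft sets and checked mechanically; the following Python sketch uses our own dictionary encoding of F1 , F2 , F3 .

```python
# Base soft sets F : S -> P(U) encoded as dicts over the parameter keys.
F1 = {"sV": {"c1", "c3"}, "sG": {"c2", "c3"}, "sL": {"c1", "c2"}}
F2 = {"sV": {"c1", "c2", "c3"}, "sG": {"c3"}, "sL": {"c1"}}
F3 = {"sV": {"c4"}, "sG": {"c2", "c4"}, "sL": {"c2", "c3", "c4"}}

def balanced(F):
    """Meta-criterion pi_Bal: every approximation block has at least 2 customers."""
    return all(len(F[s]) >= 2 for s in ("sV", "sG", "sL"))

def inclusive_vegan(F):
    """Meta-criterion pi_Inc: F(sV) contains both c1 and c3."""
    return {"c1", "c3"} <= F["sV"]

# The membership facts claimed in the text.
assert balanced(F1) and not balanced(F2)
assert inclusive_vegan(F2) and not inclusive_vegan(F3)
```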
2.30 Double-framed Soft Set
A double-framed soft set assigns to each parameter a positive and a negative approximation subset,
often constrained by an operation on parameters [119–122]. Moreover, further extensions have
been studied, including N -framed soft sets [123–125], double-framed HyperSoft sets [126, 127],
and Double-Framed SuperHyperSoft Set [128, 129].
Definition 2.30.1 (Double-Framed Soft Set). Let U be a universal set and let A be a (nonempty)
set of parameters. Assume that A is equipped with a binary operation
∗ : A × A −→ A.
A double-framed soft set over U (with parameter set A) is a triple
⟨(α, β); A⟩,
where
α : A → P(U )
and
β : A → P(U )
are mappings. For each x ∈ A, the set α(x) is interpreted as the positive frame and β(x) as the
negative frame. Moreover, the following compatibility conditions are required: for all x, y ∈ A,
α(x ∗ y) ⊇ α(x) ∩ α(y),
β(x ∗ y) ⊆ β(x) ∪ β(y).


# Page. 54

![Page Image](https://bcdn.docswell.com/page/3JK95W5MJD.jpg)

Remark 2.30.2. If one does not intend to use an algebraic operation on the parameter set,
then a double-framed soft set may be taken simply as a pair of maps α, β : A → P(U ) (i.e., the
triple ⟨(α, β); A⟩) without imposing the above ∗-compatibility axioms.
Example 2.30.3 (Real-life example of a double-framed soft set: loan pre-screening with positive
and negative evidence). Let U be a set of loan applicants:
U = {u1 , u2 , u3 , u4 , u5 , u6 }.
Let A be a set of screening parameters:
A = {aInc , aCred , aIncCred }.
Interpret aInc = “high income”, aCred = “good credit history”, and aIncCred = “high income and
good credit”.
Define a binary operation ∗ : A × A → A (parameter combination) by
aInc ∗ aCred = aIncCred ,
aCred ∗ aInc = aIncCred ,
x ∗ x = x (x ∈ A).
Define two set-valued maps α, β : A → P(U ) as follows. For each parameter x ∈ A:
• α(x) collects applicants with positive evidence supporting x,
• β(x) collects applicants with negative evidence against x.
Assume the bank’s initial data yield:
α(aInc ) = {u1 , u2 , u4 },
β(aInc ) = {u3 , u5 },
α(aCred ) = {u1 , u3 , u6 },
β(aCred ) = {u2 , u4 },
and for the combined parameter take
α(aIncCred ) = {u1 },
β(aIncCred ) = {u2 , u3 , u4 , u5 }.
Then the compatibility conditions hold:
α(aIncCred ) ⊇ α(aInc ) ∩ α(aCred ) = {u1 , u2 , u4 } ∩ {u1 , u3 , u6 } = {u1 },
and
β(aIncCred ) ⊆ β(aInc ) ∪ β(aCred ) = {u3 , u5 } ∪ {u2 , u4 } = {u2 , u3 , u4 , u5 }.
Hence ⟨(α, β); A⟩ is a double-framed soft set over U .
Interpretation: α lists candidates supported by a criterion, β lists candidates contradicted by
it, and the operation ∗ combines criteria so that positive support becomes at least as strict
(intersection), while negative evidence is at most as broad (union).
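The two ∗-compatibility conditions for the loan example can be verified directly; the short-key encoding below is our own.

```python
# Positive frames alpha(x) and negative frames beta(x) from the loan example.
alpha = {"inc": {"u1", "u2", "u4"}, "cred": {"u1", "u3", "u6"}, "inccred": {"u1"}}
beta = {"inc": {"u3", "u5"}, "cred": {"u2", "u4"}, "inccred": {"u2", "u3", "u4", "u5"}}

# alpha(x * y) must contain alpha(x) ∩ alpha(y) (support gets stricter),
# beta(x * y) must lie within beta(x) ∪ beta(y) (counter-evidence stays bounded).
assert alpha["inccred"] >= alpha["inc"] & alpha["cred"]
assert beta["inccred"] <= beta["inc"] | beta["cred"]
print("double-framed compatibility holds")
```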


# Page. 55

![Page Image](https://bcdn.docswell.com/page/LE3WK1K2E5.jpg)

2.31 Bijective Soft Set
A bijective soft set partitions the universe into disjoint parameter blocks, covering all elements,
assigning each uniquely to one parameter [130–132]. As an extension, concepts such as bijective
HyperSoft sets [133–135] have also been studied.
Definition 2.31.1 (Bijective soft set). [130–132] Let U be a nonempty universe and let E be
a set of parameters. A soft set over U is a pair (F, B), where B ⊆ E is a nonempty parameter
set and F : B → P(U ).
The soft set (F, B) is called a bijective soft set if the family of subsets {F (e)}e∈B forms a partition
of U , i.e.,
(B1) Covering: ⋃e∈B F (e) = U ;
(B2) Pairwise disjointness: for all e1 , e2 ∈ B with e1 ≠ e2 , one has F (e1 ) ∩ F (e2 ) = ∅;
(B3) Nontrivial blocks (optional but standard for literal bijectivity): F (e) ≠ ∅ for all
e ∈ B.
Equivalently, for every u ∈ U there exists a unique parameter e ∈ B such that u ∈ F (e).
If we denote the image family by
Y := { F (e) | e ∈ B } ⊆ P(U ),
then (under (B3)) the map F : B → Y is a bijection.
Remark 2.31.2. Many papers state bijective soft sets using only (B1)–(B2). In that case, literal
bijectivity holds after discarding any parameters with empty images: set B ∗ := {e ∈ B | F (e) ≠ ∅}
and restrict F to B ∗ . Then {F (e)}e∈B ∗ is a partition of U and F : B ∗ → {F (e) | e ∈ B ∗ }
is bijective.
Example 2.31.3 (Real-life example of a bijective soft set: unique department assignment). Let
U be the set of employees in a company:
U = {u1 , u2 , u3 , u4 , u5 , u6 , u7 }.
Let E be a set of parameters and choose a nonempty subset
B = {eHR , eENG , eFIN , eMKT } ⊆ E,
where each parameter denotes a department:
eHR = Human Resources,    eENG = Engineering,    eFIN = Finance,    eMKT = Marketing.


# Page. 56

![Page Image](https://bcdn.docswell.com/page/8EDK3X367G.jpg)

Define a mapping F : B → P(U ) by the employees assigned to each department:
F (eHR ) = {u1 , u6 },
F (eENG ) = {u2 , u3 , u7 },
F (eFIN ) = {u4 },
F (eMKT ) = {u5 }.
Then ⋃e∈B F (e) = U (every employee belongs to some department), and the sets
F (eHR ), F (eENG ), F (eFIN ), F (eMKT ) are pairwise disjoint (no employee belongs to two departments
simultaneously). Moreover, each F (e) is nonempty.
Hence {F (e)}e∈B is a partition of U , so (F, B) is a bijective soft set. Equivalently, each employee
u ∈ U has a unique department-parameter e ∈ B such that u ∈ F (e).
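Conditions (B1)–(B3) for the department example can be checked programmatically; the sketch below uses our own dictionary encoding of F .

```python
# Bijective soft set check: the blocks F(e) must partition the universe U.
U = {f"u{i}" for i in range(1, 8)}
F = {
    "HR": {"u1", "u6"},
    "ENG": {"u2", "u3", "u7"},
    "FIN": {"u4"},
    "MKT": {"u5"},
}

blocks = list(F.values())
union = set().union(*blocks)
covering = union == U                                 # (B1)
disjoint = sum(len(b) for b in blocks) == len(union)  # (B2) via cardinality count
nonempty = all(b for b in blocks)                     # (B3)
assert covering and disjoint and nonempty
print("bijective soft set: each employee has a unique department")
```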
2.32 Ranked Soft Set
Ranked soft sets map each parameter to an ordered partition of the universe, expressing graded
satisfaction levels for uncertain decision-making. The definition of the Ranked Soft Set is described as follows [136].
Definition 2.32.1 (Ranked Soft Set). [136] Let U be a nonempty finite universe and let E be
a set of parameters. A ranked partition of U is an ordered collection
V = (V 0 , V 1 , . . . , V k )
of subsets of U satisfying:
1. V 0 ∪ V 1 ∪ · · · ∪ V k = U,
2. For indices i, j with 0 ≤ i < j ≤ k , the elements in V j are regarded as satisfying the
corresponding attribute with a higher degree than those in V i .
A ranked soft set over U is a pair (R, E) where
R : E → R(U )
is a mapping from the set of parameters E to the family R(U ) of all ranked partitions of U .
That is, for each t ∈ E , R(t) is a ranked partition of U representing the graded evaluation of
the property t on the elements of U .
Example 2.32.2 (Real-life example of a ranked soft set: hotel recommendation by cleanliness).
Let U be a finite set of hotels in a city:
U = {h1 , h2 , h3 , h4 , h5 }.
Let E be a set of evaluation parameters and consider the parameter
t = “cleanliness” ∈ E.


# Page. 57

![Page Image](https://bcdn.docswell.com/page/V7PK4PPZJ8.jpg)

Assume we use four ordered satisfaction levels (from worst to best), so k = 3 and we form a
ranked partition
R(t) = (V 0 , V 1 , V 2 , V 3 ),
where each V j ⊆ U collects hotels assessed at rank j , and higher j means cleaner.
For instance, based on recent inspection reports and user reviews, suppose we obtain:
V 0 = {h4 } (poor cleanliness),
V 1 = {h2 } (fair cleanliness),
V 2 = {h3 , h5 } (good cleanliness),
V 3 = {h1 } (excellent cleanliness).
Then V 0 ∪ V 1 ∪ V 2 ∪ V 3 = U , and for 0 ≤ i < j ≤ 3, the hotels in V j are regarded as satisfying
the parameter t more strongly than those in V i .
Define a ranked soft set R : E → R(U ) by specifying such a ranked partition for each parameter.
In particular, the value R(t) above encodes a graded evaluation of cleanliness on U , which can
be used to recommend hotels by prioritizing higher-ranked blocks (e.g., selecting from V 3 first,
then V 2 , etc.).
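The block-priority recommendation rule just described can be sketched in Python; the list encoding of the ranked partition and the helper name `recommend` are our own illustration.

```python
# Ranked partition R(t) for "cleanliness": index j is the rank, 0 (worst) .. 3 (best).
ranked = [{"h4"}, {"h2"}, {"h3", "h5"}, {"h1"}]

def recommend(ranked_partition, k):
    """Return up to k hotels, drawn from the highest-ranked blocks first."""
    out = []
    for block in reversed(ranked_partition):  # best block first
        out.extend(sorted(block))
        if len(out) >= k:
            return out[:k]
    return out

print(recommend(ranked, 3))  # ['h1', 'h3', 'h5']
```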
2.33 Refined Soft Set
Refined soft sets index multiple soft sets by secondary parameters, representing several evaluators’ mappings from common attributes to subsets simultaneously. Related notions include
refined neutrosophic sets [137–139]. The definition of the Refined Soft Set is described as follows [140–142].
Definition 2.33.1 (Refined Soft Set). [140] Let U be a nonempty universe, and let E and F
be two sets of parameters with
E ∩ F = ∅.
Let A be a nonempty subset of E . For each parameter b ∈ F , let
fb : A → P(U )
be a soft set over U ; that is, for each a ∈ A, we have fb (a) ⊆ U . Then the collection
{ (fb , A) | b ∈ F }
is called a refined soft set over U with respect to the parameter set A and the indexing set F .
Equivalently, a refined soft set may be viewed as a mapping
f : F → {soft sets over U with parameter set A},
defined by f (b) = fb for all b ∈ F .
Example 2.33.2 (Real-life example of a refined soft set: multi-expert applicant screening). Let
U be a set of job applicants for a software engineer position:
U = {u1 , u2 , u3 , u4 , u5 , u6 }.


# Page. 58

![Page Image](https://bcdn.docswell.com/page/2JVVX22MJQ.jpg)

Let E be a parameter set of evaluation criteria and take a nonempty subset
A = {α1 , α2 , α3 } ⊆ E,
where
α1 = “strong algorithms”,
α2 = “cloud experience”,
α3 = “good communication”.
Let F be an indexing set of evaluators (secondary parameters), disjoint from E :
F = {β1 , β2 , β3 },    E ∩ F = ∅,
where
β1 = HR,    β2 = Engineering manager,    β3 = Senior engineer.
For each evaluator β ∈ F , define a soft set
fβ : A −→ P(U ),
where fβ (α) is the subset of applicants judged by evaluator β to satisfy criterion α. For instance,
suppose the assessments are:
fβ1 (α1 ) = {u1 , u2 , u4 },
fβ1 (α2 ) = {u2 , u3 , u5 },
fβ1 (α3 ) = {u1 , u3 , u6 },
fβ2 (α1 ) = {u1 , u4 , u5 },
fβ2 (α2 ) = {u2 , u5 , u6 },
fβ2 (α3 ) = {u1 , u2 , u6 },
fβ3 (α1 ) = {u2 , u4 , u6 },
fβ3 (α2 ) = {u1 , u3 , u6 },
fβ3 (α3 ) = {u1 , u4 , u5 }.
Then the collection
{(fβ , A) | β ∈ F }
is a refined soft set over U : it records, for the same primary criteria set A, multiple soft sets
indexed by evaluators in F . Equivalently, it is the mapping
f : F −→ {soft sets over U with parameter set A},
f (β) = fβ .
In practice, such a refined soft set supports consensus or aggregation rules (e.g., selecting applicants satisfying α1 according to at least two evaluators).
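The “at least two evaluators” aggregation rule can be made concrete with a Python sketch over the assessments above; the nested-dict encoding and function name are our own.

```python
# Refined soft set: one soft set per evaluator, all over the same criteria a1, a2, a3.
f = {
    "HR": {"a1": {"u1", "u2", "u4"}, "a2": {"u2", "u3", "u5"}, "a3": {"u1", "u3", "u6"}},
    "EM": {"a1": {"u1", "u4", "u5"}, "a2": {"u2", "u5", "u6"}, "a3": {"u1", "u2", "u6"}},
    "SE": {"a1": {"u2", "u4", "u6"}, "a2": {"u1", "u3", "u6"}, "a3": {"u1", "u4", "u5"}},
}

def at_least(criterion, k):
    """Applicants judged to satisfy `criterion` by at least k of the evaluators."""
    candidates = set().union(*(fb[criterion] for fb in f.values()))
    return sorted(u for u in candidates
                  if sum(u in fb[criterion] for fb in f.values()) >= k)

print(at_least("a1", 2))  # ['u1', 'u2', 'u4']
```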
2.34 MultiSoft Set
A MultiSoft set maps multiple parameters to subsets of the universe, representing simultaneous
parameterized approximations for decision-making tasks [143–145]. Related notions also include
multi-fuzzy sets [146, 147] and multi-neutrosophic sets [148, 149].
Definition 2.34.1 (Multisoft set). [143–145] Let U be a nonempty universe of discourse and
let E be a (nonempty) set of parameters. Let A ⊆ E be a (nonempty) subset of parameters,
and write P(U ) for the power set of U . A pair (F, A) is called a multisoft set over U if
F : A −→ P(U ).
For each parameter a ∈ A, the subset F (a) ⊆ U is the a-approximation (or a-value set) of the
multisoft set (F, A).


# Page. 59

![Page Image](https://bcdn.docswell.com/page/5EGLVRRXJL.jpg)

Example 2.34.2 (Real-life example of a multisoft set: smartphone selection by multiple criteria). Let U be a finite set of smartphone models:
U = {s1 , s2 , s3 , s4 , s5 , s6 }.
Let E be a set of decision parameters and take the nonempty subset
A = {aCam , aBatt , aPrice } ⊆ E,
where aCam = “good camera”, aBatt = “long battery life”, and aPrice = “affordable price”.
Define a mapping F : A → P(U ) by listing the models that satisfy each criterion according to
reviews and specifications:
F (aCam ) = {s1 , s3 , s5 },
F (aBatt ) = {s2 , s3 , s6 },
F (aPrice ) = {s1 , s2 , s4 }.
Then (F, A) is a multisoft set over U .
Interpretation: the family {F (a)}a∈A provides multiple parameterized approximations of U ,
supporting multi-criteria selection such as choosing phones in F (aCam ) ∩ F (aBatt ) (good camera
and long battery life).
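The intersection-based selection rule is one line of Python; the dictionary encoding of the multisoft set is our own.

```python
# Multisoft set (F, A) over the smartphone universe, one block per criterion.
F = {
    "cam": {"s1", "s3", "s5"},
    "batt": {"s2", "s3", "s6"},
    "price": {"s1", "s2", "s4"},
}

# Multi-criteria selection: good camera AND long battery life.
both = F["cam"] & F["batt"]
print(sorted(both))  # ['s3']
```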
2.35 GraphicSoft Set
A GraphicSoft Set further generalizes this framework by mapping each subgraph of an attribute
graph to a subset of the universe, thereby embedding inter‐attribute relationships into the soft‐set
model [150].
Definition 2.35.1 (GraphicSoft Set). [150] Let U be a universe of discourse, and let G = (V, E)
be a graph representing a set of attributes and their relationships. A GraphicSoft Set is defined
as a mapping
F : P(G) → P(U ),
which assigns to each subgraph H ∈ P(G) a subset F (H) ⊆ U . Intuitively, F (H) represents
the set of objects in U that possess the combined attributes described by the subgraph H .
Example 2.35.2 (Real-life example of a GraphicSoft Set: diet-oriented product search). Let U
be a small catalog of packaged foods:
U = {p1 , p2 , p3 , p4 , p5 , p6 },
where p1 = tofu bowl, p2 = chicken salad, p3 = gluten-free quinoa crackers, p4 = vegan protein
bar, p5 = Greek yogurt, p6 = oat milk.
Let G = (V, E) be an attribute graph whose vertices represent dietary attributes:
V = {vV , vG , vL , vP },
where vV = Vegan, vG = Gluten-free, vL = Low-sugar, vP = High-protein. Assume the edges
encode compatibility/certification requirements between attributes:
E = { {vV , vG }, {vV , vL }, {vL , vP } }.


# Page. 60

![Page Image](https://bcdn.docswell.com/page/4JQY6VV57P.jpg)

For each vertex v ∈ V , define the set of products having attribute v :
S(v) ⊆ U,
and for each edge {v, w} ∈ E , define the set of products satisfying the joint requirement (e.g.,
certified compatible) for (v, w):
C(v, w) ⊆ U.
For concreteness, take
S(vV ) = {p1 , p4 , p6 },
S(vG ) = {p1 , p3 , p4 },
S(vL ) = {p3 , p4 , p5 },
S(vP ) = {p1 , p2 , p4 , p5 },
and
C(vV , vG ) = {p1 , p4 },
C(vV , vL ) = {p4 },
C(vL , vP ) = {p4 , p5 }.
Define a mapping F : P(G) → P(U ) as follows. For any subgraph H = (V (H), E(H)) of G,
set
 \



\
F (H) :=
S(v) ∩
C(v, w) ,
{v,w}∈E(H)
v∈V (H)
with the convention that the intersection over an empty family equals U .
Then F (H) returns the products satisfying the attributes in V (H) together with the relational
constraints in E(H). For instance:
1. If H1 has V (H1 ) = {vV , vG } and E(H1 ) = {{vV , vG }}, then
F (H1 ) = S(vV ) ∩ S(vG ) ∩ C(vV , vG ) = {p1 , p4 }.
2. If H2 has V (H2 ) = {vV , vL , vP } and E(H2 ) = {{vV , vL }, {vL , vP }}, then
F (H2 ) = S(vV ) ∩ S(vL ) ∩ S(vP ) ∩ C(vV , vL ) ∩ C(vL , vP ) = {p4 }.
Thus (F, G) is a GraphicSoft Set modeling diet-oriented retrieval where edges encode compatibility/certification relations among attributes.
For reference, a comparison between Soft Sets and GraphicSoft Sets is provided in Table 2.6.
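The subgraph evaluation F (H) from the diet example can be computed directly; the function below is a minimal sketch using our own encoding (vertex keys V, G, L, P and `frozenset` edge keys).

```python
# Vertex attribute sets S(v) and edge compatibility sets C(v, w) from the example.
S = {
    "V": {"p1", "p4", "p6"}, "G": {"p1", "p3", "p4"},
    "L": {"p3", "p4", "p5"}, "P": {"p1", "p2", "p4", "p5"},
}
C = {
    frozenset({"V", "G"}): {"p1", "p4"},
    frozenset({"V", "L"}): {"p4"},
    frozenset({"L", "P"}): {"p4", "p5"},
}
UNIVERSE = {"p1", "p2", "p3", "p4", "p5", "p6"}

def F(vertices, edges):
    """F(H): intersect S(v) over V(H) and C(v, w) over E(H); empty family gives U."""
    result = set(UNIVERSE)
    for v in vertices:
        result &= S[v]
    for e in edges:
        result &= C[frozenset(e)]
    return result

assert F({"V", "G"}, [("V", "G")]) == {"p1", "p4"}              # subgraph H1
assert F({"V", "L", "P"}, [("V", "L"), ("L", "P")]) == {"p4"}   # subgraph H2
```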
A related concept is the DAGSoft set, defined as follows. A directed acyclic graph
(DAG) is a directed graph containing no directed cycles, enabling topological ordering of vertices
for computation [151–154].
Definition 2.35.3 (DAGSoft set (acyclic-hierarchical parameter soft set)). Let U be a nonempty
universe and let D = (A, →) be a finite directed acyclic graph (DAG) of parameters. A DAGSoft
set over U is a soft set (F, A), F : A → P(U ), satisfying the hierarchical coherence rule
a → b =⇒ F (b) ⊆ F (a)    (a, b ∈ A).


# Page. 61

![Page Image](https://bcdn.docswell.com/page/K74W4MMVE1.jpg)

Table 2.6: Concise comparison between Soft Sets and GraphicSoft Sets over a universe U .

| Aspect | Soft Set | GraphicSoft Set |
|---|---|---|
| Universe | Fixed universe of discourse U . | Same: fixed universe U . |
| Parameter carrier | A (usually finite) parameter set S ⊆ A (attributes). | An attribute graph G = (V, E) encoding attributes (V ) and their relations (E ). |
| Indexing domain of the map | Parameters e ∈ S . | Subgraphs H ∈ Sub(G) (or equivalently P(G)). |
| Basic data (definition) | A map F : S → P(U ); the soft set is (F, S). | A map F : Sub(G) → P(U ); the structure is (F, G). |
| What the “parameter” means | A single attribute/criterion e (treated independently unless combined externally). | A pattern of interacting attributes: a subgraph H (vertices + edges) represents a chosen combination together with their relationships. |
| How attribute relations are represented | Not represented intrinsically (no edges/constraints among parameters in the basic model). | Represented intrinsically via E(H): edges record dependency, compatibility, adjacency, etc. |
| Admissible combinations | Typically handled by taking finite families of parameters (e.g., decision rules using multiple e’s), but this is external to F . | Built-in: any subgraph H is an admissible “combined parameter”; F (H) directly models objects satisfying the combined/related attributes encoded by H . |
| Granularity / expressiveness | “Unary” parameterization: F assigns sets to individual parameters. | “Structured” parameterization: F assigns sets to structured parameter-objects (subgraphs), capturing multi-attribute interactions explicitly. |
| Model size (typical) | Depends on \|S\| evaluations of F (e). | Potentially much larger: requires values for many subgraphs (in worst case exponential in \|V \| + \|E\|). |
| Natural reduction / embedding | Base model. | Contains soft-set-like information by restricting to subgraphs representing single attributes (e.g., isolated-vertex subgraphs). Conversely, a soft set can be viewed as a degenerate case where only “atomic” subgraphs are used. |
| Typical use-cases | Decision making with attribute-wise approximations; uncertainty via parameterized subsets. | Decision/knowledge modeling where relations among attributes matter (dependencies, synergies, conflicts), and where one wants selections indexed by attribute-interaction patterns. |
Example 2.35.4 (Real-life example of a DAGSoft set: IT helpdesk ticket taxonomy with hierarchical parameters). Let U be a finite set of IT helpdesk tickets:
U = {t1 , t2 , t3 , t4 , t5 , t6 , t7 }.
Consider a parameter DAG D = (A, →) describing an issue taxonomy:
A = {aAcc , aConn , aWiFi , aVPN , aAuth , aSSO },
where the intended meanings are
aAcc = “access problem”,    aConn = “connectivity problem”,    aWiFi = “Wi-Fi problem”,
aVPN = “VPN problem”,    aAuth = “authentication problem”,    aSSO = “single sign-on problem”.
Let the directed edges encode refinement (child is more specific):
aAcc → aConn ,    aAcc → aAuth ,    aConn → aWiFi ,    aConn → aVPN ,    aAuth → aSSO .
This digraph is acyclic (a DAG).


# Page. 62

![Page Image](https://bcdn.docswell.com/page/LJ1Y4884EG.jpg)

Define a soft mapping F : A → P(U ) by
F (aAcc ) = {t1 , t2 , t3 , t4 , t5 , t6 , t7 },
F (aConn ) = {t1 , t2 , t4 , t6 },
F (aWiFi ) = {t1 , t2 },
F (aVPN ) = {t4 , t6 },
F (aAuth ) = {t3 , t5 , t7 },
F (aSSO ) = {t5 , t7 }.
Then the hierarchical coherence rule in Definition 2.35.3 holds, because each directed edge a → b
satisfies F (b) ⊆ F (a); for instance,
F (aWiFi ) ⊆ F (aConn ) ⊆ F (aAcc ),
F (aSSO ) ⊆ F (aAuth ) ⊆ F (aAcc ).
Hence (F, A) is a DAGSoft set over U , representing a parameterized classification of tickets
in which more specific issue-types select subsets of the tickets selected by their broader parent
types.
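The hierarchical coherence rule can be verified edge by edge; the sketch below encodes the ticket taxonomy with our own short parameter keys.

```python
# DAGSoft set: F maps each taxonomy parameter to its set of tickets.
F = {
    "acc": {"t1", "t2", "t3", "t4", "t5", "t6", "t7"},
    "conn": {"t1", "t2", "t4", "t6"},
    "wifi": {"t1", "t2"},
    "vpn": {"t4", "t6"},
    "auth": {"t3", "t5", "t7"},
    "sso": {"t5", "t7"},
}
edges = [("acc", "conn"), ("acc", "auth"), ("conn", "wifi"),
         ("conn", "vpn"), ("auth", "sso")]

# Coherence: a -> b implies F(b) ⊆ F(a) (child parameters are refinements).
assert all(F[b] <= F[a] for a, b in edges)
print("DAGSoft coherence holds")
```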
2.36 CycleSoft Set
A CycleSoft Set extends Soft Sets by organizing parameters in a cycle graph, mapping cycle
subgraphs to subsets of a universal set for structured decision-making [155].
Definition 2.36.1 (CycleSoft Set). [155] Let U be a universal set and let C = (A, EC ) be a
cycle graph, where A = {a1 , . . . , an } is a set of parameters arranged in a cycle and
EC = {(ai , ai+1 ) | 1 ≤ i ≤ n − 1} ∪ {(an , a1 )}
describes the cyclic adjacency among the parameters. Define the power set of C as
P(C) = {H | H is a subgraph of C}.
A CycleSoft Set is a mapping
F : P(C) → P(U ),
where for each subgraph H ∈ P(C), F (H) ⊆ U represents the set of objects associated with the
combination of parameters corresponding to H . A common aggregation is to define, for each H ,
F (H)(x) = min{ f (a)(x) : a ∈ V (H) },
with f (a) : U → [0, 1] (or characteristic functions in the crisp case).
Example 2.36.2 (Real-life example of a CycleSoft Set: selecting restaurants by cyclic service
attributes). Let U be a set of restaurants:
U = {r1 , r2 , r3 , r4 , r5 , r6 }.
Consider four service-related parameters arranged in a cycle:
A = {a1 , a2 , a3 , a4 },


# Page. 63

![Page Image](https://bcdn.docswell.com/page/GJWGXZZZ72.jpg)

where
a1 = “good food”,
a2 = “good service”,
a3 = “clean”,
a4 = “good value”.
Let C = (A, EC ) be the cycle graph with edges
EC = {(a1 , a2 ), (a2 , a3 ), (a3 , a4 ), (a4 , a1 )}.
Let P(C) denote the family of all subgraphs of C .
First define a (crisp) soft mapping f : A → P(U ) by
f (a1 ) = {r1 , r2 , r5 },
f (a2 ) = {r1 , r3 , r5 , r6 },
f (a3 ) = {r2 , r3 , r4 , r5 },
f (a4 ) = {r1 , r2 , r4 }.
For a subgraph H ∈ P(C), define the CycleSoft mapping F : P(C) → P(U ) by
F (H) := ⋂a∈V (H) f (a),
with the convention F (∅) = U .
For example, let H1 be the path subgraph with vertices {a1 , a2 , a3 } and edges {(a1 , a2 ), (a2 , a3 )}.
Then
F (H1 ) = f (a1 ) ∩ f (a2 ) ∩ f (a3 ) = {r1 , r2 , r5 } ∩ {r1 , r3 , r5 , r6 } ∩ {r2 , r3 , r4 , r5 } = {r5 }.
Thus r5 is the (unique) restaurant satisfying the combined cyclic attribute pattern “good food
→ good service → clean”.
Hence F is a CycleSoft Set on U indexed by subgraphs of the cycle C .
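The computation of F (H1 ) can be reproduced with a short Python sketch; the attribute keys are our own shorthand for a1 , . . . , a4 .

```python
# Crisp soft mapping f : A -> P(U) over the restaurant universe.
f = {
    "food": {"r1", "r2", "r5"},
    "service": {"r1", "r3", "r5", "r6"},
    "clean": {"r2", "r3", "r4", "r5"},
    "value": {"r1", "r2", "r4"},
}
UNIVERSE = {"r1", "r2", "r3", "r4", "r5", "r6"}

def F(subgraph_vertices):
    """F(H) = intersection of f(a) over the vertices of H; F(empty) = U."""
    result = set(UNIVERSE)
    for a in subgraph_vertices:
        result &= f[a]
    return result

assert F({"food", "service", "clean"}) == {"r5"}  # the H1 computation from the text
assert F(set()) == UNIVERSE                       # empty-subgraph convention
```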
2.37 ClusterSoft Set
A ClusterSoft Set groups multiple Soft Sets, capturing relationships among clustered attributes
and mapping them to subsets of a universal set for decision modeling [155].
Definition 2.37.1 (ClusterSoft Set). [155] Let {Fi }i∈I be a finite family of soft sets over a
universe U , where each soft set Fi is a mapping
Fi : Ai → P(U )
for some set of attributes Ai . Suppose the index set I is partitioned into clusters {Cj }j∈J with
each Cj ⊆ I and Cj ∩ Ck = ∅ for j ≠ k . A ClusterSoft Set is defined as a mapping
G : {Cj : j ∈ J} → P(U )
given by
G(Cj ) = ⋃i∈Cj Fi∗ (Ai ),
where Fi∗ (Ai ) denotes the set of objects in U associated with soft set Fi (possibly after appropriate aggregation or normalization). The union is taken in the usual set-theoretic sense.


# Page. 64

![Page Image](https://bcdn.docswell.com/page/4EZL611L73.jpg)

Example 2.37.2 (Real-life example of a ClusterSoft Set: grouping customer segments from
multiple marketing soft sets). Let U be a set of customers of an online shop:
U = {u1 , u2 , u3 , u4 , u5 , u6 , u7 , u8 }.
Assume the marketing team maintains a finite family of soft sets {Fi }i∈I , each capturing customers associated with certain attributes in a specific campaign. Let
I = {1, 2, 3, 4}.
Define, for each i ∈ I , a parameter set Ai and a soft mapping Fi : Ai → P(U ).
Campaign 1 (sports campaign). Let A1 = {aRun , aGym } and suppose
F1 (aRun ) = {u1 , u3 , u6 },
F1 (aGym ) = {u2 , u3 , u7 }.
Campaign 2 (outdoor campaign). Let A2 = {aHike } and suppose
F2 (aHike ) = {u1 , u4 , u6 , u8 }.
Campaign 3 (family campaign). Let A3 = {aKids } and suppose
F3 (aKids ) = {u2 , u5 , u7 }.
Campaign 4 (premium campaign). Let A4 = {aPremium } and suppose
F4 (aPremium ) = {u3 , u4 , u8 }.
For each soft set Fi , define the associated customer set
Fi∗ (Ai ) := ⋃a∈Ai Fi (a) ⊆ U.
Then
F1∗ (A1 ) = {u1 , u2 , u3 , u6 , u7 },    F2∗ (A2 ) = {u1 , u4 , u6 , u8 },
F3∗ (A3 ) = {u2 , u5 , u7 },              F4∗ (A4 ) = {u3 , u4 , u8 }.
Now partition the index set I into clusters (segments):
CActive = {1, 2},    CLifestyle = {3, 4},
so that {CActive , CLifestyle } is a partition of I.
Define the ClusterSoft mapping G : {CActive , CLifestyle } → P(U ) by
G(C) = ⋃i∈C Fi∗ (Ai ).
Hence
G(CActive ) = F1∗ (A1 ) ∪ F2∗ (A2 ) = {u1 , u2 , u3 , u4 , u6 , u7 , u8 },
G(CLifestyle ) = F3∗ (A3 ) ∪ F4∗ (A4 ) = {u2 , u3 , u4 , u5 , u7 , u8 }.
Thus G is a ClusterSoft Set: each cluster aggregates the customers associated with the soft sets
belonging to that cluster, producing unified target segments for marketing actions.
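The two-level aggregation G(C) = ⋃i∈C Fi∗ (Ai ) can be checked end to end in Python; the campaign keys and the helper name `star` are our own.

```python
# One soft set per campaign; F_i*(A_i) unions its blocks, G unions over each cluster.
F = {
    1: {"run": {"u1", "u3", "u6"}, "gym": {"u2", "u3", "u7"}},
    2: {"hike": {"u1", "u4", "u6", "u8"}},
    3: {"kids": {"u2", "u5", "u7"}},
    4: {"premium": {"u3", "u4", "u8"}},
}
clusters = {"active": {1, 2}, "lifestyle": {3, 4}}

def star(i):
    """F_i*(A_i): all customers appearing in some block of soft set F_i."""
    return set().union(*F[i].values())

G = {name: set().union(*(star(i) for i in C)) for name, C in clusters.items()}
assert G["active"] == {"u1", "u2", "u3", "u4", "u6", "u7", "u8"}
assert G["lifestyle"] == {"u2", "u3", "u4", "u5", "u7", "u8"}
```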


# Page. 65

![Page Image](https://bcdn.docswell.com/page/Y76W2LLM7V.jpg)

2.38 Soft Expert Set
A soft expert set incorporates expert participation and their opinions: parameter–expert–opinion
triples are mapped to subsets of U , equivalently yielding a mapping from E × X × O into P(U )
[156–159]. As extensions, fuzzy soft expert sets [160,161], neutrosophic soft expert sets [162–164],
and HyperSoft expert sets [25, 135, 165] are also known.
Definition 2.38.1 (Soft Expert Set). [156–159] Let U be a universe, E a set of parameters,
X a set of experts, and O = {0, 1} a set of opinions. Put Z := E × X × O and let A ⊆ Z be
nonempty. A soft expert set on U is a pair (G, A) where
G : A −→ P(U )
assigns to each triple α = (e, x, o) ∈ A a subset G(α) ⊆ U . Thus G(α) collects the elements of
U supported by expert x, with opinion o, under parameter e.
Example 2.38.2 (Real-life example of a soft expert set: smartphone selection by multiple
experts). Let U be a set of smartphone models:
U = {s1 , s2 , s3 , s4 , s5 }.
Let E be a set of evaluation parameters:
E = {eCam , eBatt , ePrice },
where eCam = “good camera”, eBatt = “long battery life”, and ePrice = “affordable price”.
Let X be a set of experts:
X = {x1 , x2 , x3 },
where x1 is a reviewer, x2 is an engineer, and x3 is a budget-conscious user. Let O = {0, 1} be
the set of opinions, where 1 means approve and 0 means disapprove. Set Z = E × X × O and
choose A = Z (i.e., all expert–parameter–opinion triples are allowed).
Define a mapping G : A → P(U ) by listing, for each triple (e, x, o), the phones receiving opinion
o from expert x under parameter e. For example, suppose:
G(eCam , x1 , 1) = {s1 , s3 },
G(eCam , x1 , 0) = {s2 , s4 , s5 },
G(eBatt , x2 , 1) = {s2 , s3 , s5 },
G(eBatt , x2 , 0) = {s1 , s4 },
G(ePrice , x3 , 1) = {s2 , s4 },
G(ePrice , x3 , 0) = {s1 , s3 , s5 },
and define G(e, x, o) similarly for all remaining triples in A.
Then (G, A) is a soft expert set on U : each triple (e, x, o) determines the subset of smartphones
supported (if o = 1) or rejected (if o = 0) by expert x with respect to parameter e. This
structure supports aggregation rules such as selecting phones approved by a majority of experts
under key parameters.
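The majority-style aggregation mentioned at the end of the example can be sketched as follows (the triple encoding follows the example; the aggregation rule and thresholds are illustrative assumptions, not from the cited papers):

```python
# Sketch of the soft expert set (G, A) of Example 2.38.2 with a simple
# majority-approval aggregation over the listed triples.
U = {"s1", "s2", "s3", "s4", "s5"}

# G maps (parameter, expert, opinion) triples to subsets of U.
G = {
    ("eCam",   "x1", 1): {"s1", "s3"},
    ("eCam",   "x1", 0): {"s2", "s4", "s5"},
    ("eBatt",  "x2", 1): {"s2", "s3", "s5"},
    ("eBatt",  "x2", 0): {"s1", "s4"},
    ("ePrice", "x3", 1): {"s2", "s4"},
    ("ePrice", "x3", 0): {"s1", "s3", "s5"},
}

def approvals(phone):
    """Number of listed triples with opinion 1 whose subset contains the phone."""
    return sum(1 for (e, x, o), S in G.items() if o == 1 and phone in S)

# Phones approved under at least 2 of the 3 listed (parameter, expert) pairs.
majority = {s for s in U if approvals(s) >= 2}
print(sorted(majority))  # ['s2', 's3']
```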


# Page. 66

![Page Image](https://bcdn.docswell.com/page/G75M211Q74.jpg)

2.39 Soft Rough Set
A soft rough set approximates a subset of the universe through a soft set: the lower approximation collects elements certainly contained in the subset, while the upper approximation collects elements possibly related to it [16, 166]. Related notions such as fuzzy soft rough sets [167–169] and neutrosophic soft rough sets [166, 170, 171] are also known.
Definition 2.39.1 (Soft Rough Set). [16, 166] Let S = (F, A) be a soft set over a universe U ,
and let P = (U, S) be the corresponding soft approximation space. For X ⊆ U , the soft P -lower
and soft P -upper approximations of X are defined by
aprP (X) = { u ∈ U : ∃a ∈ A such that u ∈ F (a) ⊆ X },
aprP (X) = { u ∈ U : ∃a ∈ A such that u ∈ F (a) and F (a) ∩ X ≠ ∅ }.
The pair ( aprP (X), aprP (X) ) is called the soft rough set of X with respect to P .
Example 2.39.2 (Real-life example of a soft rough set: identifying “reliable suppliers” from
parameterized checklists). Let U be a set of candidate suppliers:
U = {s1 , s2 , s3 , s4 , s5 }.
Let A be a set of audit parameters (checklists):
A = {aISO , aOnTime , aLowDefect }.
Define a soft set S = (F, A) over U by
F (aISO ) = {s1 , s2 , s4 } (ISO-certified suppliers),
F (aOnTime ) = {s1 , s3 , s4 } (historically on-time suppliers),
F (aLowDefect ) = {s2 , s4 , s5 } (low defect-rate suppliers).
Let P = (U, S) be the induced soft approximation space.
Suppose the procurement team proposes a target set
X = {s1 , s4 } ⊆ U
of preferred suppliers (e.g., shortlisted by a manager).
Lower approximation. An element u ∈ U belongs to aprP (X) if there exists a parameter
a ∈ A such that u ∈ F (a) ⊆ X . Here,
F (aISO ) = {s1 , s2 , s4 } ⊈ X,
F (aOnTime ) = {s1 , s3 , s4 } ⊈ X,
F (aLowDefect ) = {s2 , s4 , s5 } ⊈ X.
Thus no F (a) is contained in X , and hence
aprP (X) = ∅.
Upper approximation. An element u ∈ U belongs to aprP (X) if there exists a ∈ A such that u ∈ F (a) and F (a) ∩ X ≠ ∅. Since
F (aISO ) ∩ X = {s1 , s4 } ≠ ∅,   F (aOnTime ) ∩ X = {s1 , s4 } ≠ ∅,   F (aLowDefect ) ∩ X = {s4 } ≠ ∅,
every supplier appearing in at least one of these F (a) sets is possibly compatible with X .
Therefore,
aprP (X) = F (aISO ) ∪ F (aOnTime ) ∪ F (aLowDefect ) = {s1 , s2 , s3 , s4 , s5 } = U.
Hence the soft rough set of X with respect to P is
( aprP (X), aprP (X) ) = (∅, U ).
Interpretation: based on these coarse checklists, no supplier is certainly in the preferred set,
while every supplier is possibly related to it through at least one parameter block.
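The two approximations of Definition 2.39.1 translate directly into set comprehensions. The sketch below (data from the example; helper names are ours) recomputes the pair (∅, U):

```python
# Soft P-lower and P-upper approximations from Definition 2.39.1,
# applied to the supplier data of Example 2.39.2.
U = {"s1", "s2", "s3", "s4", "s5"}
F = {
    "aISO":       {"s1", "s2", "s4"},
    "aOnTime":    {"s1", "s3", "s4"},
    "aLowDefect": {"s2", "s4", "s5"},
}

def lower(X):
    """u is certainly in X if some F(a) containing u is a subset of X."""
    return {u for u in U
            if any(u in Fa and Fa <= X for Fa in F.values())}

def upper(X):
    """u is possibly related to X if some F(a) contains u and meets X."""
    return {u for u in U
            if any(u in Fa and Fa & X for Fa in F.values())}

X = {"s1", "s4"}
print(lower(X), upper(X) == U)  # set() True
```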


# Page. 67

![Page Image](https://bcdn.docswell.com/page/9J29411WER.jpg)

2.40 Weighted Soft Set
A weighted soft set maps each parameter to a universe subset and assigns a weight reflecting
that parameter’s relative significance [172–174].
Definition 2.40.1 (Weighted Soft Set). [174] Let U be a finite universe of discourse and let E
be a finite set of parameters (attributes). A finite weighted soft set over U is defined as a triple
(µ, A, Q),
where:
• A ⊆ E is a finite set of selected parameters,
• µ : A → P(U ) is a mapping that assigns to each parameter a ∈ A a subset µ(a) ⊆ U , and
• Q : A → [0, 1] is a finite weight function that assigns to each parameter a ∈ A a weight
Q(a), representing its relative importance.
Thus, the finite weighted soft set can be represented as
(µ, A, Q) = {(a, µ(a), Q(a)) | a ∈ A}.
Example 2.40.2 (Real-life example of a weighted soft set: selecting apartments with weighted
criteria). Let U be a finite set of apartments:
U = {h1 , h2 , h3 , h4 , h5 }.
Let E be a finite set of decision parameters and choose
A = {aRent , aComm , aSafe } ⊆ E,
where aRent = “affordable rent”, aComm = “short commute”, and aSafe = “safe neighborhood”.
Define the soft mapping µ : A → P(U ) by listing apartments that satisfy each criterion:
µ(aRent ) = {h1 , h3 , h5 },
µ(aComm ) = {h2 , h3 , h4 },
µ(aSafe ) = {h1 , h2 , h4 }.
Assign a weight function Q : A → [0, 1] reflecting the user’s priorities:
Q(aRent ) = 0.50,
Q(aComm ) = 0.30,
Q(aSafe ) = 0.20.
Then (µ, A, Q) is a weighted soft set over U .
Interpretation: affordability is the most important criterion (weight 0.50), followed by commute
time and safety. A simple weighted choice score for an apartment h ∈ U can be formed by
summing the weights of parameters that accept h, e.g.,
Score(h) := Σ_{a∈A : h∈µ(a)} Q(a),
so that higher scores indicate better overall fit to the weighted criteria.
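The weighted score above is a one-line sum in Python. The following sketch (data from the example; the ranking helper is illustrative) ranks the five apartments:

```python
# Weighted-choice scores for Example 2.40.2: an apartment's score is the
# sum of the weights of the criteria that accept it.
mu = {
    "aRent": {"h1", "h3", "h5"},
    "aComm": {"h2", "h3", "h4"},
    "aSafe": {"h1", "h2", "h4"},
}
Q = {"aRent": 0.50, "aComm": 0.30, "aSafe": 0.20}

def score(h):
    return sum(Q[a] for a, accepted in mu.items() if h in accepted)

ranking = sorted(["h1", "h2", "h3", "h4", "h5"], key=score, reverse=True)
print([(h, round(score(h), 2)) for h in ranking])
# [('h3', 0.8), ('h1', 0.7), ('h2', 0.5), ('h4', 0.5), ('h5', 0.5)]
```

Apartment h3 wins: it satisfies the two most heavily weighted criteria.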


# Page. 68

![Page Image](https://bcdn.docswell.com/page/DEY4MZZ9JM.jpg)

2.41 Other Soft Set
In this section, we briefly describe several other variants of soft sets.
• Geometric soft sets [175]: Geometric soft sets map parameters to point-incidence subsets, capturing hyperplane or distance relations and enabling efficient geometric realization and network analysis.
• Multiparameterized soft sets [176, 177]: A multiparameterized soft set maps each multi-parameter tuple to a subset of the universe, capturing combined attribute approximations simultaneously.
• Concave soft sets [178–180]: Concave soft sets are order-preserving: if x ≤ y then F (x) ⊆ F (y); equivalently, F coincides with its closure [[F ]] for the given order.


# Page. 69

![Page Image](https://bcdn.docswell.com/page/VJNYW33D78.jpg)



# Page. 70

![Page Image](https://bcdn.docswell.com/page/YE9PX998J3.jpg)

Chapter 3
Uncertain Soft Theory
In this chapter, we examine uncertainty-aware soft set models, including fuzzy soft sets and
neutrosophic soft sets.
3.1 Fuzzy Soft Set
A fuzzy soft set assigns each parameter a fuzzy subset of the universe, giving graded membership
degrees for objects under parameters [181–183]. Related notions include fuzzy HyperSoft sets
[125, 184, 185] and neutrosophic HyperSoft sets [186–189].
Definition 3.1.1 (Fuzzy Soft Set). [181–183] Let U be a nonempty universe and E a set of
parameters. Write F(U ) := {µ : U → [0, 1]} for the family of all fuzzy subsets of U . For a fixed
subset A ⊆ E , a fuzzy soft set over U (with respect to A) is a pair
(ΓA , A),
ΓA : A −→ F(U ),   x ⟼ µx (·),
so that it can be represented as the collection
ΓA = { (x, µx ) | x ∈ A, µx : U → [0, 1] }.
For u ∈ U and x ∈ A, the value µx (u) is the degree to which u approximately satisfies the parameter x. Equivalently, one may specify a map Γ̃ : E → F(U ) with Γ̃(x) = 0 (the zero membership function) for all x ∉ A.
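A minimal computational sketch may clarify the definition; the dictionary encoding and sample membership degrees below are illustrative assumptions, not data from the cited papers:

```python
# A minimal fuzzy soft set (Gamma_A, A): each parameter in A is sent to a
# membership function mu_x : U -> [0, 1].
U = ["u1", "u2", "u3"]

Gamma = {
    "cheap":  {"u1": 0.9, "u2": 0.4, "u3": 0.1},
    "modern": {"u1": 0.2, "u2": 0.8, "u3": 0.6},
}

def degree(x, u):
    """Degree to which u approximately satisfies parameter x
    (0 for parameters outside A, matching the zero-extension convention)."""
    return Gamma.get(x, {}).get(u, 0.0)

print(degree("cheap", "u1"), degree("big", "u2"))  # 0.9 0.0
```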
3.2 Intuitionistic Fuzzy Soft Set
An intuitionistic fuzzy soft set assigns each parameter an intuitionistic fuzzy set, specifying
membership and non-membership degrees with hesitation [190–193].


# Page. 71

![Page Image](https://bcdn.docswell.com/page/GE8D299ZED.jpg)

Definition 3.2.1 (Intuitionistic fuzzy set (Atanassov)). [5,194] Let U be a nonempty universe.
An intuitionistic fuzzy set (IFS) A on U is specified by a pair of functions
µA : U → [0, 1],
νA : U → [0, 1],
called the membership and non-membership functions, respectively, such that
0 ≤ µA (u) + νA (u) ≤ 1   for all u ∈ U.
The hesitation (indeterminacy) degree is then
πA (u) := 1 − µA (u) − νA (u) ∈ [0, 1].
We write IFS(U ) for the class of all intuitionistic fuzzy sets on U .
Definition 3.2.2 (Intuitionistic fuzzy soft set). [190, 191] Let U be a nonempty universe and let E be a nonempty set of parameters. An intuitionistic fuzzy soft set (IFSS) over U (with parameter set E ) is a pair
(F̃ , A),   A ⊆ E,
where
F̃ : A −→ IFS(U )
is a mapping. Equivalently, an IFSS can be represented as a family of ordered pairs
(F̃ , A) = { (e, F̃ (e)) | e ∈ A, F̃ (e) ∈ IFS(U ) }.
If one prefers to keep the full parameter set E as the domain, one may extend F̃ to a mapping F̃ : E → IFS(U ) by setting, for each e ∈ E \ A, the null IFS F̃ (e) = ∅̃ defined by
µ∅̃ (u) = 0,   ν∅̃ (u) = 1   (∀u ∈ U ),
which matches the common convention in the literature.
3.3 Neutrosophic Soft Set
A neutrosophic set assigns each element independent truth, indeterminacy, and falsity degrees,
generalizing fuzzy and intuitionistic fuzzy sets [8, 9]. The Neutrosophic Soft Set is a concept
that combines the principles of Neutrosophic Sets and Soft Sets [137, 195–201]. The definition is
provided below.
Definition 3.3.1 (Neutrosophic Soft Set [202, 203]). Let U be a universe and E a set of parameters. A Neutrosophic Soft Set (NSS) over U is defined as a pair (F, A), where A ⊆ E and
F : A −→ P (U ),
with P (U ) being the collection of Neutrosophic Sets on U . Hence for each parameter e ∈ A,
F (e) = ( TF (e) , IF (e) , FF (e) )
is a Neutrosophic Set on U , satisfying
0 ≤ TF (e) (x) + IF (e) (x) + FF (e) (x) ≤ 3,   ∀ x ∈ U.


# Page. 72

![Page Image](https://bcdn.docswell.com/page/LELM2WW17R.jpg)

3.4 Plithogenic Soft Set
A plithogenic set models multi-attribute membership using contradiction degrees among attribute values, generalizing fuzzy, intuitionistic, and neutrosophic sets [14, 204]. A plithogenic
soft set maps attribute-value tuples to plithogenic evaluations and corresponding subsets, incorporating contradiction profiles relative to dominant values [205, 206].
Definition 3.4.1 (Plithogenic soft set). [205, 206] Let U be a universe of discourse and let
z ∈ {C, F, IF, N }. We write Pz (U ) for the z-power set of U , defined by
PC (U ) := P(U ),
PF (U ) := { µ : U → [0, 1] },
PIF (U ) := { (µ, ν) : U → [0, 1]^2 | 0 ≤ µ(u) + ν(u) ≤ 1 (∀u ∈ U ) },
and let PN (U ) denote the family of (single-valued) neutrosophic sets on U (i.e., triples (T, I, F ) : U → [0, 1]^3 satisfying the usual neutrosophic constraints, according to the chosen convention).
Let a1 , a2 , . . . , an (n ≥ 1) be distinct attributes with corresponding (pairwise disjoint) value sets V1 , V2 , . . . , Vn such that Vi ∩ Vj = ∅ for i ≠ j . Set
Υ := V1 × V2 × · · · × Vn .
Fix a dominant value vector D = (D1 , . . . , Dn ) ∈ Υ and, for each i ∈ {1, . . . , n}, a contradiction
degree function
ci : Vi × Vi −→ [0, 1]
satisfying
ci (v, v) = 0, ci (v, w) = ci (w, v).
Define
[0, 1]^D := [0, 1]^n
and, for υ = (v1 , . . . , vn ) ∈ Υ, define the contradiction vector relative to D by
cD (υ) := ( c1 (D1 , v1 ), c2 (D2 , v2 ), . . . , cn (Dn , vn ) ) ∈ [0, 1]^D .
A z -plithogenic soft set (briefly, plithogenic soft set) over U is a pair (FPz , Υ) where
FPz : Υ −→ [0, 1]^D × Pz (U ).
Equivalently, for each υ ∈ Υ we can write
FPz (υ) = ( cD (υ), Sυ ),   with Sυ ∈ Pz (U ),
so that each attribute-value tuple υ is assigned (i) its contradiction profile relative to the dominant tuple D, and (ii) a z-valued subset Sυ of the universe U .


# Page. 73

![Page Image](https://bcdn.docswell.com/page/4JMY8995JW.jpg)

3.5 Uncertain Soft Set
An Uncertain Set is any set-theoretic model assigning graded, possibly multi-component membership degrees to elements, unifying fuzzy, intuitionistic, neutrosophic, plithogenic, and related uncertainty frameworks [207–209].
Definition 3.5.1 (Uncertain Set). [207] Let U be the collection of all Uncertain Models. Fix U ∈ U with
Dom(U ) ⊆ [0, 1]^r
for some integer r ≥ 1, and let X be a nonempty base set (universe of discourse).
An Uncertain Set of type U on X is a pair
AU = (X, µ),
where
µ : X −→ Dom(U )
assigns to each element x ∈ X a U –membership degree
µ(x) ∈ Dom(U ).
Equivalently, once the base set X and the Uncertain Model U are fixed, we may identify the Uncertain Set with its membership function and simply write
AU : X −→ Dom(U ),   x ⟼ µ(x),
and view the collection of all Uncertain Sets of type U on X as the function space
Dom(U )^X = { µ | µ : X → Dom(U ) }.
In this sense, an Uncertain Set is a U –labeling of the base set X by membership–degree tuples
taken from Dom(U ).
Remark 3.5.2 (Recovery of classical fuzzy–type sets). Let X be a nonempty set and let AU =
(X, µ) be an Uncertain Set of type U .
1. (Fuzzy Set) Take U = Fuzzy with
Dom(U ) = [0, 1] = [0, 1]^1 .
Then an Uncertain Set of type U is exactly a fuzzy set in the sense of Zadeh, since
µ : X → [0, 1]
is the usual fuzzy membership function.


# Page. 74

![Page Image](https://bcdn.docswell.com/page/PJR95GGZ79.jpg)

2. (Intuitionistic Fuzzy Set) Take U = Intuitionistic Fuzzy with
Dom(U ) = { (µ, ν) ∈ [0, 1]^2 | µ + ν ≤ 1 } ⊆ [0, 1]^2 .
Then AU = (X, µ) coincides with an intuitionistic fuzzy set, because for each x ∈ X ,
µ(x) = ( µA (x), νA (x) ) ∈ [0, 1]^2
satisfies µA (x) + νA (x) ≤ 1.
3. (Neutrosophic Set) Take U = Neutrosophic with
Dom(U ) = { (T, I, F ) ∈ [0, 1]^3 | 0 ≤ T + I + F ≤ 3 } ⊆ [0, 1]^3 .
Then AU = (X, µ) is exactly a single-valued neutrosophic set, since
µ(x) = ( TA (x), IA (x), FA (x) ) ∈ [0, 1]^3
with 0 ≤ TA (x) + IA (x) + FA (x) ≤ 3 for all x ∈ X .
4. (Plithogenic Set) For a Plithogenic Model U = Plithogenic with degree-domain
Dom(U ) = { ( v, pdf(x, v), pCF(v1 , v2 ) ) | v ∈ Pv , pdf(x, v) ∈ [0, 1]^s , pCF(v1 , v2 ) ∈ [0, 1]^t } ⊆ [0, 1]^{s+t+ℓ} ,
an Uncertain Set of type U on X reproduces a Plithogenic Set on X , since each
µ(x) ∈ Dom(U )
encodes the Plithogenic degrees associated with x ∈ X .
Thus, by choosing different Uncertain Models U ∈ U and their corresponding domains Dom(U ) ⊆ [0, 1]^r , the general notion of an Uncertain Set in Definition 3.5.1 unifies fuzzy sets, intuitionistic fuzzy sets, neutrosophic sets, plithogenic sets, and many other existing uncertainty-set frameworks.
Definition 3.5.3 (Uncertainty domain and D-uncertain sets). Let U be a nonempty universe.
Fix an integer r ≥ 1 and a nonempty set (degree-domain)
D ⊆ [0, 1]^r .
A D-uncertain set on U is a mapping
µ : U −→ D.
We denote the class of all D-uncertain sets on U by
UncD (U ) := D^U = { µ | µ : U → D }.
Definition 3.5.4 (Uncertain soft set). Let U be a nonempty universe and let E be a nonempty set of parameters. Fix a nonempty uncertainty domain D ⊆ [0, 1]^r as in Definition 3.5.3. Let
A ⊆ E be nonempty.
A D-uncertain soft set (briefly, an uncertain soft set) over U with parameter set A is a pair
(F, A) where
F : A −→ UncD (U ).
Equivalently, for each parameter e ∈ A, the value F (e) is a D-uncertain set on U , i.e., a function
F (e) : U −→ D,   u ⟼ F (e)(u) ∈ D.
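The definition is easy to instantiate computationally. In the sketch below (encodings and predicate names are ours, purely illustrative), the degree-domain D is passed in as a validity predicate, so the same code covers both the fuzzy domain [0, 1] and the neutrosophic domain of Remark 3.5.5:

```python
# Sketch of a D-uncertain soft set (Definition 3.5.4): F maps parameters to
# functions U -> D, and D is represented by a membership predicate.
def is_valid_uncertain_soft_set(F, universe, in_domain):
    """Check that F(e)(u) lies in D for every parameter e and element u."""
    return all(in_domain(F[e][u]) for e in F for u in universe)

U = ["u1", "u2"]

# Fuzzy domain D_Fuz = [0, 1].
fuzzy_F = {"e1": {"u1": 0.7, "u2": 0.2}}
in_fuzzy = lambda d: isinstance(d, float) and 0.0 <= d <= 1.0

# Neutrosophic domain D_Neu: triples (t, i, f) with 0 <= t + i + f <= 3.
neut_F = {"e1": {"u1": (0.6, 0.3, 0.4), "u2": (0.1, 0.9, 0.8)}}
in_neut = lambda d: (len(d) == 3 and all(0.0 <= c <= 1.0 for c in d)
                     and 0.0 <= sum(d) <= 3.0)

print(is_valid_uncertain_soft_set(fuzzy_F, U, in_fuzzy),
      is_valid_uncertain_soft_set(neut_F, U, in_neut))  # True True
```

This mirrors the specialization argument of Theorem 3.5.6: only the domain predicate changes between the fuzzy and neutrosophic cases.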


# Page. 75

![Page Image](https://bcdn.docswell.com/page/PEXQKXX1JX.jpg)

Remark 3.5.5 (Two standard choices of D). (i) Fuzzy domain: DFuz := [0, 1] ⊆ [0, 1]^1 .
(ii) Single-valued neutrosophic domain:
DNeu := { (t, i, f ) ∈ [0, 1]^3 | 0 ≤ t + i + f ≤ 3 } ⊆ [0, 1]^3 .
(If one adopts a different conventional constraint for neutrosophic triples, replace DNeu accordingly; the theorem below remains valid.)
Theorem 3.5.6 (Uncertain soft sets generalize fuzzy and neutrosophic soft sets). Let U be a
universe, E a parameter set, and A ⊆ E nonempty.
(1) (Fuzzy soft sets as a special case) If D = DFuz = [0, 1], then D-uncertain soft sets (F, A)
are exactly fuzzy soft sets over U with parameter set A.
(2) (Neutrosophic soft sets as a special case) If D = DNeu , then D-uncertain soft sets (F, A)
are exactly (single-valued) neutrosophic soft sets over U with parameter set A.
Proof. (1) Assume D = [0, 1]. Then
UncD (U ) = D^U = [0, 1]^U ,
the set of all membership functions on U . Hence an uncertain soft set is a map
F : A → [0, 1]^U .
For each e ∈ A, put µe := F (e) ∈ [0, 1]^U . Thus (F, A) assigns to each parameter e a fuzzy subset µe of U , which is precisely the definition of a fuzzy soft set. Conversely, any fuzzy soft set (Γ, A) with Γ(e) = µe : U → [0, 1] is exactly a map Γ : A → [0, 1]^U = UncD (U ), hence a D-uncertain soft set. Therefore the two notions coincide when D = [0, 1].
(2) Assume D = DNeu ⊆ [0, 1]^3 . Then UncD (U ) = D^U is the class of all functions
U → DNeu ,   u ⟼ (T (u), I(u), F (u)),
i.e., the class of (single-valued) neutrosophic sets on U under the chosen constraint. An uncertain soft set is a map
F : A → DNeu^U ,
so each e ∈ A is assigned a neutrosophic set F (e) on U . This is exactly the definition of a (single-valued) neutrosophic soft set. Conversely, any neutrosophic soft set (G, A) is by definition a map G : A → DNeu^U = UncD (U ), hence a D-uncertain soft set. Thus the two notions coincide when D = DNeu .
Remark 3.5.7 (Optional extension: parameter-dependent uncertainty types). If one wishes to allow different uncertainty domains per parameter, one may fix a family {De ⊆ [0, 1]^{re}}e∈A and define an "untyped" uncertain soft set as a map F (e) : U → De . The specialization arguments in Theorem 3.5.6 then apply componentwise.


# Page. 76

![Page Image](https://bcdn.docswell.com/page/3EK95WWMED.jpg)

3.6 Z-Soft Set
A Z-soft set maps each parameter and object to a Z-number, combining fuzzy assessment with
fuzzy reliability information.
Definition 3.6.1 (Fuzzy number). [210, 211] A fuzzy number on [0, 1] is a fuzzy set Ã : [0, 1] → [0, 1] such that:
1. Ã is normal: sup_{x∈[0,1]} Ã(x) = 1;
2. Ã is convex: for every α ∈ (0, 1], the α-cut
[Ã]_α := { x ∈ [0, 1] | Ã(x) ≥ α }
is a (possibly degenerate) closed interval;
3. Ã is upper semicontinuous and has compact support in [0, 1].
We write FN([0, 1]) for the family of all fuzzy numbers on [0, 1].
Definition 3.6.2 (Z-number). [17] A Z-number is an ordered pair
Z = (Ã, B̃),
where Ã ∈ FN([0, 1]) represents a fuzzy restriction (e.g., a fuzzy assessment of a degree), and B̃ ∈ FN([0, 1]) represents the reliability (certainty) of Ã. Let
Z := FN([0, 1]) × FN([0, 1])
denote the set of all such Z-numbers.
Definition 3.6.3 (Z-valued fuzzy set). Let U be a nonempty universe. A Z-valued fuzzy set on
U is a mapping
µ : U −→ Z.
We denote the family of all Z-valued fuzzy sets on U by
ZFS(U ) := Z^U .
Definition 3.6.4 (Z-soft set). Let U be a nonempty universe and let E be a nonempty set of
parameters. Let A ⊆ E be nonempty. A Z-soft set over U with parameter set A is a pair (F, A)
where
F : A −→ ZFS(U ).
Equivalently, (F, A) may be identified with a single mapping
µ : A × U −→ Z,   µ(e, u) = ( Ãe,u , B̃e,u ),
where Ãe,u encodes the (fuzzy) assessment of u under parameter e, and B̃e,u encodes the reliability of that assessment.
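A Z-soft set can be sketched concretely once a representation of FN([0, 1]) is chosen; below, triangular fuzzy numbers on [0, 1] stand in for FN([0, 1]), and all data are illustrative assumptions:

```python
# Sketch of a Z-soft set (Definition 3.6.4): mu maps (parameter, object)
# pairs to (assessment, reliability) pairs of triangular fuzzy numbers.
from dataclasses import dataclass

@dataclass(frozen=True)
class TriangularFN:
    """Triangular fuzzy number on [0, 1] with peak b and support [a, c]."""
    a: float
    b: float
    c: float

    def membership(self, x):
        if self.a <= x <= self.b and self.b > self.a:
            return (x - self.a) / (self.b - self.a)
        if self.b <= x <= self.c and self.c > self.b:
            return (self.c - x) / (self.c - self.b)
        return 1.0 if x == self.b else 0.0

mu = {
    ("quality", "u1"): (TriangularFN(0.6, 0.8, 1.0),   # assessment: "high"
                        TriangularFN(0.7, 0.9, 1.0)),  # reliability: "very sure"
}

A_tilde, B_tilde = mu[("quality", "u1")]
print(A_tilde.membership(0.8), B_tilde.membership(0.7))  # 1.0 0.0
```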


# Page. 77

![Page Image](https://bcdn.docswell.com/page/L73WK11275.jpg)

3.7 Functorial Soft Set
Functorial Sets provide an additional categorical layer: they organize such assignments across
objects and morphisms so that structure transport is systematic and compositional [207, 212].
A functorial soft set treats parameters as objects in a category, assigning subsets functorially so
morphisms induce consistent transformations naturally.
Definition 3.7.1 (Functorial Set). [207] Let C be a category and let
F : C −→ Set
be a covariant functor. The pair (C, F ) is called a Functorial Set. For each object X ∈ Ob(C),
the set F (X) is interpreted as the collection of F -structures attached to X . Every morphism
f : X → Y induces a structure-preserving map
F (f ) : F (X) −→ F (Y ),
such that F (idX ) = idF (X) and
F (g ◦ f ) = F (g) ◦ F (f )
for all composable morphisms f, g in C .
Definition 3.7.2 (Covariant power-set functor). Let Set be the category of sets and functions.
The covariant power-set functor
P : Set −→ Set
is defined on objects by P(X) = {A | A ⊆ X} and on morphisms f : X → Y by the direct
image map
P(f ) : P(X) −→ P(Y ),
P(f )(A) := f [A] = {f (x) | x ∈ A}.
Then P(idX ) = idP(X) and P(g ◦ f ) = P(g) ◦ P(f ).
Definition 3.7.3 (Functorial soft set). Let C be a category. Let
U : C −→ Set and E : C −→ Set
be covariant functors, interpreted as a universe functor and a parameter functor, respectively.
A functorial soft set on (C, U, E) is a natural transformation
F : E =⇒ P ◦ U.
Equivalently, it is the data of maps
FX : E(X) −→ P( U(X) )   (X ∈ Ob(C))
such that for every morphism f : X → Y in C the following naturality condition holds:
P( U(f ) ) ◦ FX = FY ◦ E(f ).
In elementwise form, for all e ∈ E(X),
U(f )[ FX (e) ] = FY ( E(f )(e) ) ⊆ U(Y ).
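On finite data the naturality condition can be checked exhaustively. The sketch below (an illustrative toy category consisting of one map between two finite sets, with U = E and the singleton components FX(x) = {x} of Theorem 3.7.4(2)) verifies it:

```python
# Finite check of the naturality square P(U(f)) . F_X = F_Y . E(f)
# from Definition 3.7.3, using the covariant power-set functor.
FX_set = {1, 2, 3}            # F(X)
FY_set = {"a", "b"}           # F(Y)
f = {1: "a", 2: "a", 3: "b"}  # F(f) : F(X) -> F(Y)

def direct_image(mapping, subset):
    """P(f): the covariant power-set functor on a morphism."""
    return {mapping[x] for x in subset}

def F_component(x):
    """Component F_X(x) := {x} (singleton embedding)."""
    return {x}

# Naturality: U(f)[F_X(x)] must equal F_Y(E(f)(x)) for every x.
natural = all(direct_image(f, F_component(x)) == F_component(f[x])
              for x in FX_set)
print(natural)  # True
```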


# Page. 78

![Page Image](https://bcdn.docswell.com/page/87DK3XX6JG.jpg)

Theorem 3.7.4 (Functorial soft sets generalize soft sets and functorial sets). (1) (Soft sets as
a special case) Let C be the terminal category with one object ∗ and only id∗ . Let U, E :
C → Set be constant functors with U(∗) = U and E(∗) = E . Then every functorial soft set
F : E ⇒ P ◦ U is uniquely the same as a classical soft set mapping
F : E −→ P(U ).
(2) (Functorial sets embed into functorial soft sets) Let (C, F ) be a functorial set, i.e.,
F : C → Set. Define U := F and E := F , and define components
FX : F (X) −→ P( F (X) ),   FX (x) := {x}.
Then F : E ⇒ P ◦ U is a natural transformation, hence a functorial soft set. Moreover, from
this functorial soft set one recovers the original functorial set by taking either U or E (both
equal F ).
Proof. (1) In the terminal category C , a functor U : C → Set is determined by a single set
U := U(∗), and likewise E is determined by a single set E := E(∗). A natural transformation
F :E ⇒P ◦U
is determined by its single component
F∗ : E −→ P(U ).
Since the only morphism is id∗ , the naturality condition is automatic. Thus, setting F := F∗
yields precisely a soft set mapping F : E → P(U ), and conversely any such mapping defines a
unique natural transformation. Hence the notions coincide.
(2) Let f : X → Y be a morphism in C and let x ∈ F (X). Compute the left-hand side of naturality:
P( U(f ) )( FX (x) ) = P( F (f ) )({x}) = F (f )[{x}] = { F (f )(x) }.
Compute the right-hand side:
FY ( E(f )(x) ) = FY ( F (f )(x) ) = { F (f )(x) }.
Thus P(U(f )) ◦ FX = FY ◦ E(f ) holds for all f , so F is natural and hence defines a functorial
soft set. Finally, since U = E = F by construction, the original functorial set is recovered
immediately from the functorial soft set.


# Page. 79

![Page Image](https://bcdn.docswell.com/page/VJPK4PKZE8.jpg)



# Page. 80

![Page Image](https://bcdn.docswell.com/page/2EVVX2VMEQ.jpg)

Chapter 4
Applications of Soft Set
In this chapter, we describe several studies on extended concepts developed using soft set theory.
4.1 Soft Graph
A soft graph assigns each parameter a subgraph of a fixed graph, modeling uncertain, parameterized relationships among vertices and edges [213,214]. As extensions, concepts such as HyperSoft
Graphs [215–217], Fuzzy Soft Graphs [218, 219], Neutrosophic Soft Graphs [220, 221], Soft HyperGraphs [222, 223], Soft SuperHyperGraphs [224], and Soft Directed Graphs [225, 226] have
also been studied.
Definition 4.1.1 (Soft Graph). [213, 214] Let G = (V, E) be a simple undirected graph and C
a nonempty set of parameters. A soft graph over G with parameter set C is a quadruple
( G, C, A, B ),
where
A : C −→ P(V ),
B : C −→ P(E),
and for each c ∈ C ,
B(c) ⊆ { {u, v} ∈ E : u ∈ A(c), v ∈ A(c) }.
The pair ( A(c), B(c) ) is called the soft subgraph at parameter c.
Example 4.1.2 (Example of a soft graph: a friendship network under different interaction
contexts). Let G = (V, E) be a simple undirected graph modeling friendships among six people:
V = {v1 , v2 , v3 , v4 , v5 , v6 },
E = { {v1 , v2 }, {v1 , v3 }, {v2 , v3 }, {v2 , v4 }, {v3 , v5 }, {v4 , v5 }, {v5 , v6 } }.
Let C be a set of parameters describing interaction contexts:
C = {cWork , cSport , cOnline }.


# Page. 81

![Page Image](https://bcdn.docswell.com/page/57GLVRLXEL.jpg)

Define A : C → P(V ) (active people) and B : C → P(E) (active friendships) by:
A(cWork ) = {v1 , v2 , v3 , v4 },
B(cWork ) = {{v1 , v2 }, {v1 , v3 }, {v2 , v3 }, {v2 , v4 }},
A(cSport ) = {v2 , v3 , v5 , v6 },
B(cSport ) = {{v2 , v3 }, {v3 , v5 }, {v5 , v6 }},
A(cOnline ) = {v1 , v3 , v5 },
B(cOnline ) = {{v1 , v3 }, {v3 , v5 }}.
For each c ∈ C , every edge in B(c) has both endpoints in A(c), hence
B(c) ⊆ { {u, v} ∈ E : u, v ∈ A(c) }.
Therefore, ( G, C, A, B ) is a soft graph over G.
Interpretation: the underlying friendship network is G, while each parameter c extracts a context-dependent subgraph (e.g., work interactions, sports interactions, online interactions).
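The defining condition of a soft graph is directly checkable. The following Python sketch (edge encoding and helper name are ours) verifies Example 4.1.2:

```python
# Verify that (G, C, A, B) of Example 4.1.2 is a soft graph: every edge
# assigned to a context must join two vertices active in that context.
E = {frozenset(e) for e in
     [("v1", "v2"), ("v1", "v3"), ("v2", "v3"), ("v2", "v4"),
      ("v3", "v5"), ("v4", "v5"), ("v5", "v6")]}

A = {
    "work":   {"v1", "v2", "v3", "v4"},
    "sport":  {"v2", "v3", "v5", "v6"},
    "online": {"v1", "v3", "v5"},
}
B = {
    "work":   {frozenset(e) for e in [("v1", "v2"), ("v1", "v3"),
                                      ("v2", "v3"), ("v2", "v4")]},
    "sport":  {frozenset(e) for e in [("v2", "v3"), ("v3", "v5"),
                                      ("v5", "v6")]},
    "online": {frozenset(e) for e in [("v1", "v3"), ("v3", "v5")]},
}

def is_soft_graph(E, A, B):
    # Each B(c) must consist of edges of G with both endpoints in A(c).
    return all(edge in E and edge <= A[c]
               for c in B for edge in B[c])

print(is_soft_graph(E, A, B))  # True
```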
4.2 Soft Topological Space
A soft topological space is a parameterized family of soft open sets closed under soft unions and
finite intersections, containing the null and absolute soft sets [227, 228]. Related notions include
hypersoft topological spaces [229–231], fuzzy soft topological spaces [232,233], intuitionistic fuzzy
soft topological spaces [234, 235], and neutrosophic soft topological spaces [227, 236–238].
Definition 4.2.1 (Soft topology and soft topological space). [227, 228] Let X be a nonempty
universe and let A be a nonempty set of parameters. A soft set over X (with parameter set A)
is a pair (F, A) where F : A → P(X). Denote by SS(X, A) the collection of all such soft sets.
Define the A-null and A-absolute soft sets by
0A = (F0 , A), F0 (a) = ∅ (a ∈ A),   1A = (F1 , A), F1 (a) = X (a ∈ A).
For a family {(Fi , A)}i∈I ⊆ SS(X, A), define the soft union ⊔_{i∈I} (Fi , A) = (F, A) by
F (a) = ⋃_{i∈I} Fi (a)   (a ∈ A),
and for (F, A), (G, A) ∈ SS(X, A) define the soft intersection (F, A) ⊓ (G, A) = (H, A) by
H(a) = F (a) ∩ G(a)   (a ∈ A).
A subfamily τ ⊆ SS(X, A) is called a soft topology on X (with parameter set A) if
(ST1) 0A , 1A ∈ τ ;
(ST2) if (F, A), (G, A) ∈ τ , then (F, A) ⊓ (G, A) ∈ τ ;
(ST3) if (Fi , A) ∈ τ for all i ∈ I , then ⊔_{i∈I} (Fi , A) ∈ τ .


# Page. 82

![Page Image](https://bcdn.docswell.com/page/4EQY6VY5JP.jpg)

In this case, the triple (X, τ, A) is called a soft topological space. Elements of τ are called soft
open sets; a soft set (F, A) is soft closed if its soft complement (F, A)c belongs to τ .
Example 4.2.2 (Example of a soft topological space: neighborhood-based accessibility under
different criteria). Let
X = {x1 , x2 , x3 }
be a set of locations (e.g., three service points), and let
A = {aWalk , aDrive }
be a set of parameters, where aWalk means “reachable on foot” and aDrive means “reachable by
car”.
Define the following soft sets over X (all with parameter set A):
0A (aWalk ) = 0A (aDrive ) = ∅,   1A (aWalk ) = 1A (aDrive ) = X,
and a nontrivial soft set (F, A) by
F (aWalk ) = {x1 , x2 },   F (aDrive ) = {x1 , x2 , x3 }.
Now define
τ = { 0A , 1A , (F, A) } ⊆ SS(X, A).
We verify that τ is a soft topology on X :
(ST1) 0A , 1A ∈ τ by definition.
(ST2) Finite soft intersections remain in τ :
(F, A) ⊓ 1A = (F, A),   (F, A) ⊓ 0A = 0A ,   (F, A) ⊓ (F, A) = (F, A).
(ST3) Arbitrary soft unions of members of τ remain in τ :
0A ⊔ (F, A) = (F, A),   (F, A) ⊔ 1A = 1A ,   0A ⊔ 1A = 1A ,
and any union of copies of (F, A) is still (F, A).
Hence (X, τ, A) is a soft topological space.
Interpretation: under the walking criterion, the “open” accessible region is {x1 , x2 }, while under driving it is X ; the soft topology τ contains the null region, the whole region, and this parameter-dependent accessibility region.
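Since τ here is finite, the axiom checks (ST1)–(ST3) above reduce to pairwise closure and can be verified mechanically; the encoding of soft sets as parameter-to-frozenset dictionaries in this Python sketch is ours:

```python
# Axiom check (ST1)-(ST3) for the soft topology of Example 4.2.2.
# Soft sets are parameter -> frozenset mappings, hashed as value tuples.
X = frozenset({"x1", "x2", "x3"})
A = ("aWalk", "aDrive")

def key(F):
    return tuple(F[a] for a in A)

null_ss = {"aWalk": frozenset(), "aDrive": frozenset()}
abs_ss  = {"aWalk": X, "aDrive": X}
F_ss    = {"aWalk": frozenset({"x1", "x2"}), "aDrive": X}

tau = [null_ss, abs_ss, F_ss]
tau_keys = {key(F) for F in tau}

def inter(F, G):
    return {a: F[a] & G[a] for a in A}

def union(F, G):
    return {a: F[a] | G[a] for a in A}

st1 = key(null_ss) in tau_keys and key(abs_ss) in tau_keys
st2 = all(key(inter(F, G)) in tau_keys for F in tau for G in tau)
st3 = all(key(union(F, G)) in tau_keys for F in tau for G in tau)
print(st1, st2, st3)  # True True True
```

For a finite τ, arbitrary unions reduce to iterated pairwise unions, so the pairwise check suffices.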


# Page. 83

![Page Image](https://bcdn.docswell.com/page/KJ4W4MWV71.jpg)

4.3 Soft Algebra
A soft algebra is a family of soft sets closed under soft complement and finite soft unions,
containing the null soft set [239–242].
Definition 4.3.1 (Soft algebra). [239, 240] Let X be a nonempty universe and let Q be a
nonempty set of parameters. A soft set over X (with parameter set Q) is a pair (F, Q) where
F : Q −→ P(X).
Denote by SQ (X) the family of all such soft sets.
Define the null soft set ΦQ = (F0 , Q) and the absolute soft set XQ = (F1 , Q) by
F0 (q) = ∅,   F1 (q) = X   (q ∈ Q).
For (F, Q) ∈ SQ (X) define the soft complement (F, Q)c = (F c , Q) by
F c (q) = X \ F (q)
(q ∈ Q).
For (F, Q), (G, Q) ∈ SQ (X) define the soft union and soft intersection by
(F, Q) ∪̃ (G, Q) = (H, Q), H(q) = F (q) ∪ G(q),
(F, Q) ∩̃ (G, Q) = (K, Q), K(q) = F (q) ∩ G(q).
A subfamily Σ ⊆ SQ (X) is called a soft algebra on X (parameterized by Q) if:
(SA1) ΦQ ∈ Σ;
(SA2) if (F, Q) ∈ Σ, then (F, Q)c ∈ Σ;
(SA3) if (Fi , Q) ∈ Σ for i = 1, 2, . . . , k , then ∪̃_{i=1}^{k} (Fi , Q) ∈ Σ;
equivalently, Σ is closed under finite soft unions.
Example 4.3.2 (Example of a soft algebra: “fixed-or-empty” soft sets). Let
X = {1, 2, 3, 4} and Q = {q1 , q2 }
be a universe and a parameter set. Fix two subsets of X :
A = {1, 2},
B = {3, 4}.
Consider the following subfamily Σ ⊆ SQ (X) consisting of all soft sets (F, Q) such that for each
parameter q ∈ Q one has
F (q) ∈ {∅, A, B, X}.
Equivalently,
Σ = { (F, Q) ∈ SQ (X) | F (q1 ), F (q2 ) ∈ {∅, A, B, X} }.
Then Σ is a soft algebra on X (parameterized by Q):


# Page. 84

![Page Image](https://bcdn.docswell.com/page/LE1Y48Y47G.jpg)

(SA1) The null soft set ΦQ belongs to Σ since ΦQ (q1 ) = ΦQ (q2 ) = ∅.
(SA2) If (F, Q) ∈ Σ, then for each q ∈ Q we have F (q) ∈ {∅, A, B, X}, and hence
F c (q) = X \ F (q) ∈ {X, B, A, ∅} ⊆ {∅, A, B, X}.
Thus (F, Q)c ∈ Σ.
(SA3) If (F1 , Q), . . . , (Fk , Q) ∈ Σ, then for each q ∈ Q every Fi (q) is one of ∅, A, B, X , and therefore
⋃_{i=1}^{k} Fi (q) ∈ {∅, A, B, X}.
Hence ∪̃_{i=1}^{k} (Fi , Q) ∈ Σ.
Interpretation: Σ models a simplified decision system where, for each parameter, only four
outcomes are allowed (select nothing, select group A, select group B , or select all X ), and the
family remains stable under combining rules (union) and negating rules (complement).
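The closure claims (SA2) and (SA3) for this example can be checked exhaustively over all 16 soft sets in Σ; the enumeration in this Python sketch is ours:

```python
# Closure check for the "fixed-or-empty" soft algebra of Example 4.3.2:
# per parameter, values range over {∅, A, B, X} with A ∪ B = X.
from itertools import product

X = frozenset({1, 2, 3, 4})
blocks = [frozenset(), frozenset({1, 2}), frozenset({3, 4}), X]  # ∅, A, B, X
Q = ("q1", "q2")

# Sigma: all soft sets whose value at each parameter is one of the blocks.
Sigma = [dict(zip(Q, vals)) for vals in product(blocks, repeat=2)]
allowed = set(blocks)

# (SA2): complements stay in Sigma (X - ∅ = X, X - A = B, etc.).
sa2 = all((X - F[q]) in allowed for F in Sigma for q in Q)

# (SA3): binary (hence finite) soft unions stay in Sigma.
sa3 = all((F[q] | G[q]) in allowed
          for F in Sigma for G in Sigma for q in Q)

print(len(Sigma), sa2, sa3)  # 16 True True
```

The check works because {∅, A, B, X} is itself closed under complement and union when A and B partition X.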
4.4 Soft Lattice
A soft lattice is a lattice structure on soft sets (often modulo soft equality), using soft union
and intersection operations [243–245]. Related notions include fuzzy soft lattices [246, 247],
intuitionistic fuzzy soft lattices [248–250], and neutrosophic soft lattices [243].
Definition 4.4.1 (Soft lattice). [243–245] Let U be a nonempty universe and let E be a
nonempty set of parameters. Write
S(U, E) := {(F, A) | A ⊆ E, F : A → P(U )}
for the collection of all (crisp) soft sets over U .
(1) Operations. For (F, A), (G, B) ∈ S(U, E) define the extended union
(F, A) ∪̃ (G, B) := (H, A ∪ B),
where for each e ∈ A ∪ B ,
H(e) = F (e) if e ∈ A \ B;   H(e) = G(e) if e ∈ B \ A;   H(e) = F (e) ∪ G(e) if e ∈ A ∩ B,
and define the restricted intersection
(F, A) ∩̃ (G, B) := (K, A ∩ B),   K(e) = F (e) ∩ G(e)   (e ∈ A ∩ B).
(2) Generalized (1-)soft equality. Define a binary relation ⊆1 on S(U, E) by
(F, A) ⊆1 (G, B)   :⇐⇒   A = ∅ or ( ∀e ∈ A ∃e′ ∈ B : F (e) ⊆ G(e′ ) ).


# Page. 85

![Page Image](https://bcdn.docswell.com/page/GEWGXZGZJ2.jpg)

Chapter 4. Applications of Soft Set
Define 1-soft equality ≈1 by

(F, A) ≈1 (G, B)  :⇐⇒  (F, A) ⊑1 (G, B) and (G, B) ⊑1 (F, A).
Then ≈1 is an equivalence relation on S(U, E).
(3) The soft lattice (quotient lattice). Let S(U, E)/≈1 be the set of ≈1 -equivalence classes,
and write [(F, A)] for the class of (F, A). Define binary operations ∨, ∧ on S(U, E)/≈1 by

[(F, A)] ∨ [(G, B)] := [ (F, A) ∪̃ (G, B) ],
[(F, A)] ∧ [(G, B)] := [ (F, A) ∩̃ (G, B) ].

A soft lattice (with respect to ≈1) is the lattice

( S(U, E)/≈1, ∨, ∧ ),
which satisfies the lattice axioms (commutativity, associativity, idempotency, and absorption) in
the usual sense on equivalence classes.
If one additionally specifies the bottom and top elements (1-null and 1-universal soft sets), then
it becomes a bounded soft lattice.
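The extended union and restricted intersection of part (1) translate directly to code. The following sketch uses a dictionary encoding (parameter → value set) that is my own convention, not one from the cited papers:

```python
# A soft set (F, A) is encoded as a dict mapping each parameter in A
# to a subset of the universe U.

def extended_union(F, G):
    """(F, A) ∪̃ (G, B) = (H, A ∪ B), with the three-case rule for H(e)."""
    H = {}
    for e in F.keys() | G.keys():
        if e in F and e in G:
            H[e] = F[e] | G[e]      # e ∈ A ∩ B
        elif e in F:
            H[e] = set(F[e])        # e ∈ A \ B
        else:
            H[e] = set(G[e])        # e ∈ B \ A
    return H

def restricted_intersection(F, G):
    """(F, A) ∩̃ (G, B) = (K, A ∩ B), K(e) = F(e) ∩ G(e)."""
    return {e: F[e] & G[e] for e in F.keys() & G.keys()}

F = {"e1": {1, 2}, "e2": {3}}
G = {"e2": {3, 4}, "e3": {5}}
assert extended_union(F, G) == {"e1": {1, 2}, "e2": {3, 4}, "e3": {5}}
assert restricted_intersection(F, G) == {"e2": {3}}
```

Note that the union is taken over A ∪ B while the intersection is restricted to A ∩ B, mirroring the definition above.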
4.5 Soft Vector
A soft vector is a parameter-indexed mapping into a vector space, representing context-dependent
vectors and enabling membership in soft sets [251].
Definition 4.5.1 (Soft vector (soft element of a vector space)). [251] Let V be a vector space
over a field F, and let A be a nonempty set of parameters. A soft vector over V (with respect
to A) is a mapping
ṽ : A −→ V.
If (F, A) is a soft set over V , i.e. F : A → P(V ), then a soft vector ṽ is said to belong to (F, A)
(denoted ṽ ∈ (F, A)) if
ṽ(a) ∈ F (a)
for all a ∈ A.
Remark 4.5.2. A soft vector ṽ is called constant if there exists v ∈ V such that ṽ(a) = v for
all a ∈ A.
Example 4.5.3 (Example of a soft vector: portfolio weights under different market scenarios).
Let V = R3 be the vector space of portfolio weight vectors for three assets (Asset 1, Asset 2,
Asset 3) over the field R. Let the parameter set represent market scenarios:
A = {aBull , aBase , aBear }.
Define a soft vector ṽ : A → V by assigning a (scenario-dependent) weight vector to each
scenario:
ṽ(aBull ) = (0.50, 0.30, 0.20),
ṽ(aBase ) = (0.40, 0.40, 0.20),
ṽ(aBear ) = (0.20, 0.50, 0.30).


# Page. 86

![Page Image](https://bcdn.docswell.com/page/47ZL61LLJ3.jpg)

Chapter 4. Applications of Soft Set
Thus ṽ is a soft vector over V .
Now define a soft set (F, A) over V describing admissible portfolios under each scenario:
F (aBull ) = {(x1 , x2 , x3 ) ∈ R3 | x1 ≥ 0.40, x2 ≤ 0.40, x3 ≤ 0.30},
F (aBase ) = {(x1 , x2 , x3 ) ∈ R3 | 0.30 ≤ x1 ≤ 0.50, 0.30 ≤ x2 ≤ 0.50, x3 = 0.20},
F (aBear ) = {(x1 , x2 , x3 ) ∈ R3 | x1 ≤ 0.30, x2 ≥ 0.40, x3 ≥ 0.20}.
Then ṽ ∈ (F, A) because, for every a ∈ A, the vector ṽ(a) satisfies the corresponding constraints
and hence belongs to F (a).
Interpretation: the soft vector encodes a portfolio recommendation that adapts to market scenarios, and membership in the soft set expresses feasibility of the recommendation under scenario-specific rules.
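The membership check ṽ ∈ (F, A) of the example amounts to one constraint test per scenario. A minimal sketch, where encoding each F(a) as a predicate is my own choice:

```python
# Scenario-indexed weight vectors (the soft vector of Example 4.5.3).
v_tilde = {
    "bull": (0.50, 0.30, 0.20),
    "base": (0.40, 0.40, 0.20),
    "bear": (0.20, 0.50, 0.30),
}

# Each F(a) is represented by its characteristic predicate on R^3.
F = {
    "bull": lambda x: x[0] >= 0.40 and x[1] <= 0.40 and x[2] <= 0.30,
    "base": lambda x: 0.30 <= x[0] <= 0.50 and 0.30 <= x[1] <= 0.50 and x[2] == 0.20,
    "bear": lambda x: x[0] <= 0.30 and x[1] >= 0.40 and x[2] >= 0.20,
}

# v_tilde ∈ (F, A) iff v_tilde(a) ∈ F(a) for every scenario a.
belongs = all(F[a](v_tilde[a]) for a in v_tilde)
print(belongs)  # True
```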
4.6 Soft functions
A soft function maps soft sets between universes via underlying object and parameter maps,
transporting approximations through images and preimages [252–254].
Definition 4.6.1 (Soft function (soft mapping between soft classes)). Let X and Y be nonempty
universes, and let A and B be nonempty parameter sets. Denote
SS(X, A) := {(F, A) | F : A → P(X)},
SS(Y, B) := {(G, B) | G : B → P(Y )}.
Let u : X → Y and p : A → B be mappings. The soft function induced by (u, p) is the mapping
fpu : SS(X, A) −→ SS(Y, B),
defined as follows.
(1) Image. For (F, A) ∈ SS(X, A), define its image

fpu(F, A) := (F′, p(A)),

where F′ : B → P(Y) is given, for each b ∈ B, by

F′(b) = ⋃_{a ∈ p⁻¹(b) ∩ A} u(F(a)) if p⁻¹(b) ∩ A ≠ ∅, and F′(b) = ∅ otherwise.

(Here u(F(a)) = {u(x) | x ∈ F(a)} ⊆ Y.)
(2) Inverse image. For (G, B) ∈ SS(Y, B), define its inverse image

fpu⁻¹(G, B) := (H, p⁻¹(B)),

where H : A → P(X) is given, for each a ∈ A, by

H(a) = u⁻¹(G(p(a))) if p(a) ∈ B, and H(a) = ∅ otherwise.
The soft function fpu is called soft injective (resp. soft surjective) if both u and p are injective
(resp. surjective).


# Page. 87

![Page Image](https://bcdn.docswell.com/page/YJ6W2LWMJV.jpg)

Example 4.6.2 (Example of a soft function induced by (u, p)). Let
X = {x1 , x2 , x3 , x4 } and Y = {y1 , y2 , y3 }
be universes, and let
A = {a1 , a2 , a3 },
B = {b1 , b2 }
be parameter sets.
Define an object-mapping u : X → Y and a parameter-mapping p : A → B by

u(x1) = y1,  u(x2) = y1,  u(x3) = y2,  u(x4) = y3,
p(a1) = b1,  p(a2) = b1,  p(a3) = b2.
Consider the soft set (F, A) ∈ SS(X, A) defined by
F (a1 ) = {x1 , x3 },
F (a2 ) = {x2 },
F (a3 ) = {x4 }.
We compute its image under the soft function fpu . Since p(A) = {b1 , b2 }, we have fpu (F, A) =
(F 0 , p(A)) with F 0 : B → P(Y ) given by
F 0 (b1 ) = u(F (a1 )) ∪ u(F (a2 )) = {u(x1 ), u(x3 )} ∪ {u(x2 )} = {y1 , y2 },
F 0 (b2 ) = u(F (a3 )) = {u(x4 )} = {y3 }.
Hence
fpu (F, A) = (F 0 , {b1 , b2 }),
F 0 (b1 ) = {y1 , y2 }, F 0 (b2 ) = {y3 }.
Next, let (G, B) ∈ SS(Y, B) be given by
G(b1 ) = {y1 },
G(b2 ) = {y2 , y3 }.
Its inverse image under fpu is fpu⁻¹(G, B) = (H, p⁻¹(B)) = (H, A), where
H(a1 ) = u−1 (G(p(a1 ))) = u−1 (G(b1 )) = u−1 ({y1 }) = {x1 , x2 },
H(a2 ) = u−1 (G(p(a2 ))) = u−1 (G(b1 )) = {x1 , x2 },
H(a3 ) = u−1 (G(p(a3 ))) = u−1 (G(b2 )) = u−1 ({y2 , y3 }) = {x3 , x4 }.
Thus fpu transports soft information from (X, A) to (Y, B) by combining parameters via p and pushing forward object-sets via u, while fpu⁻¹ pulls soft information back by preimages.
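The computations of Example 4.6.2 can be reproduced in a few lines; the string encodings of X, Y, A, B are assumptions of this sketch:

```python
# Object map u : X → Y and parameter map p : A → B from Example 4.6.2.
u = {"x1": "y1", "x2": "y1", "x3": "y2", "x4": "y3"}
p = {"a1": "b1", "a2": "b1", "a3": "b2"}

F = {"a1": {"x1", "x3"}, "a2": {"x2"}, "a3": {"x4"}}
G = {"b1": {"y1"}, "b2": {"y2", "y3"}}

# Image: F'(b) = union of u(F(a)) over all a with p(a) = b.
F_img = {}
for a, b in p.items():
    F_img.setdefault(b, set()).update(u[x] for x in F[a])
assert F_img == {"b1": {"y1", "y2"}, "b2": {"y3"}}

# Inverse image: H(a) = u^{-1}(G(p(a))).
H = {a: {x for x, y in u.items() if y in G[p[a]]} for a in p}
assert H == {"a1": {"x1", "x2"}, "a2": {"x1", "x2"}, "a3": {"x3", "x4"}}
```

The assertions match the values F′(b1) = {y1, y2}, F′(b2) = {y3}, H(a1) = H(a2) = {x1, x2}, H(a3) = {x3, x4} computed above.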
Remark 4.6.3 (A common special case: fixed parameters). If A = B and p = idA , then fpu
reduces to the soft mapping induced only by u:
fu (F, A) = (Fu , A),
Fu (a) = u(F (a)) (a ∈ A).


# Page. 88

![Page Image](https://bcdn.docswell.com/page/GJ5M21MQJ4.jpg)

4.7 Soft groups
A soft group assigns each parameter a subgroup of a given group, forming a parameterized family
of subgroups [255, 256]. Related notions include hypersoft groups [257], fuzzy soft groups [258],
and neutrosophic soft groups [259].
Definition 4.7.1 (Soft group). [255, 256] Let G be a group with identity element e, and let E
be a nonempty set of parameters. Let A ⊆ E be nonempty. A soft set over G is a pair (F, A)
where
F : A −→ P(G).
The soft set (F, A) is called a soft group over G if
F (a) ≤ G
for all a ∈ A,
i.e., for every parameter a, the value F (a) is a (classical) subgroup of G. Equivalently, a soft
group is a parameterized family of subgroups of G. We may also denote a soft group by the
triple (G, F, A).
Remark 4.7.2 (Standard special cases). Let (F, A) be a soft group over G.
(i) (F, A) is an identity soft group if F (a) = {e} for all a ∈ A.
(ii) (F, A) is an absolute soft group if F (a) = G for all a ∈ A.
Definition 4.7.3 (Soft subgroup). Let (F, A) and (H, B) be soft groups over the same group G. We say that (H, B) is a soft subgroup of (F, A), and write (H, B) ≤̃ (F, A), if

B ⊆ A  and  H(b) ≤ F(b) for all b ∈ B.
Example 4.7.4 (Example of a soft subgroup). Let G = (Z, +) be the additive group of integers,
and let
E = {a2 , a4 , a6 }
be a set of parameters. Take A = E .
Define a soft group (F, A) over G by assigning, to each parameter, a subgroup of Z:
F (a2 ) = 2Z,
F (a4 ) = 4Z,
F (a6 ) = 6Z.
Each F (ai ) is a subgroup of (Z, +), hence (F, A) is a soft group.
Now take the subset B = {a4 , a6 } ⊆ A and define another soft group (H, B) by
H(a4 ) = 8Z,
H(a6 ) = 12Z.
Again, H(a4 ) and H(a6 ) are subgroups of Z, so (H, B) is a soft group.
Moreover, for each b ∈ B we have subgroup inclusions
H(a4 ) = 8Z ≤ 4Z = F (a4 ),
H(a6 ) = 12Z ≤ 6Z = F (a6 ).
Since also B ⊆ A, it follows that

(H, B) ≤̃ (F, A),

i.e., (H, B) is a soft subgroup of (F, A).
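Since mZ ≤ nZ holds exactly when n divides m, the soft subgroup check of this example reduces to a divisibility test. A small sketch, where representing nZ by the integer n is my own encoding:

```python
# Soft group (F, A): each parameter maps to a subgroup nZ of (Z, +),
# represented here simply by the integer n.
F = {"a2": 2, "a4": 4, "a6": 6}
H = {"a4": 8, "a6": 12}  # candidate soft subgroup (H, B)

def is_soft_subgroup(H, F):
    # (H, B) ≤̃ (F, A)  iff  B ⊆ A and H(b) ≤ F(b) for every b ∈ B;
    # mZ ≤ nZ holds iff n divides m.
    return all(b in F and H[b] % F[b] == 0 for b in H)

print(is_soft_subgroup(H, F))  # True: 8Z ≤ 4Z and 12Z ≤ 6Z
```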


# Page. 89

![Page Image](https://bcdn.docswell.com/page/LE3WK1WZE5.jpg)

4.8 Soft Field
A soft field assigns each parameter a subfield of a fixed field, forming a parameterized family
closed under field operations.
Definition 4.8.1 (Soft field). Let (K, +, ·) be a (crisp) field, let E be a nonempty set of
parameters, and let A ⊆ E be nonempty. A soft field over K (with parameter set A) is a soft
set (F, A) over K , i.e.,
F : A −→ P(K),
such that for every a ∈ A, the set F (a) ⊆ K is a (classical) subfield of K . Equivalently, for each
a ∈ A the following conditions hold:
(i) 0, 1 ∈ F(a) and 1 ≠ 0;
(ii) if x, y ∈ F (a), then x − y ∈ F (a);
(iii) if x, y ∈ F (a), then x · y ∈ F (a);
(iv) if x ∈ F(a) and x ≠ 0, then x⁻¹ ∈ F(a).
Thus, a soft field is a parameterized family of subfields of the fixed ground field K .
Remark 4.8.2 (Related notions).

(i) Replacing “subfield” by “subring” yields the standard notion of a soft ring.

(ii) In uncertainty-aware extensions, one replaces “subfield” by an appropriate uncertain analogue (e.g., a neutrosophic subfield), obtaining a neutrosophic soft field.

(iii) Some authors also define “soft field” via the soft-element framework as a commutative soft ring with soft unity in which every nonzero soft element is a soft unit.
4.9 Soft Ring
A soft ring assigns each parameter a subring of a fixed ring, forming a parameterized family
closed under addition and multiplication [260–262].
Definition 4.9.1 (Soft ring). Let (R, +, ·) be a (not necessarily unital) ring, let E be a nonempty
set of parameters, and let A ⊆ E be nonempty. A pair (F, A) is called a soft ring over R if
F : A −→ P(R)
is a mapping such that, for every a ∈ A, the set F (a) ⊆ R is a (classical) subring of R.
Equivalently, for each a ∈ A:

F(a) ≠ ∅,  and  x − y ∈ F(a), x · y ∈ F(a) for all x, y ∈ F(a).
Thus a soft ring is a parameterized family of subrings of the fixed ground ring R.


# Page. 90

![Page Image](https://bcdn.docswell.com/page/8EDK3XK47G.jpg)

Remark 4.9.2. If, additionally, each F (a) is an ideal of R (instead of merely a subring), then
(F, A) is called a soft ideal over R.
Example 4.9.3 (Example of a soft ring). Let R = (Z, +, ·) be the ring of integers, and let
E = {a2 , a3 , a6 }
be a parameter set. Take A = E .
Define a mapping F : A → P(Z) by
F (a2 ) = 2Z,
F (a3 ) = 3Z,
F (a6 ) = 6Z,
where nZ := {nk | k ∈ Z}.
Each F(ai) is a nonempty subring of Z: if x = nk and y = nℓ belong to nZ, then

x − y = n(k − ℓ) ∈ nZ,  x · y = n²kℓ ∈ nZ.
Therefore (F, A) is a soft ring over Z.
Moreover, each nZ is actually an ideal of Z, so (F, A) is also a soft ideal in the sense of Remark 4.9.2.
4.10 Soft Matroid
A matroid is a combinatorial independence structure generalizing linear independence, defined
via independent sets satisfying hereditary and exchange axioms [263–265]. A soft matroid is
a parameterized matroid-like structure on soft sets, using soft points and exchange axioms for
independence [266, 267].
Definition 4.10.1 (Soft-matroid). Let U be a universal set, E a set of parameters, and A ⊆ E
nonempty. Let FA = (F, A) be a finite soft set over U , i.e., F : A → P(U) and F (e) is finite for
all e ∈ A.
A soft-point of FA is a soft set p_x^e = (P, A) such that P(e) = {x} for some e ∈ A and P(e′) = ∅ for all e′ ∈ A \ {e}; we write p_x^e ∈̃ FA whenever x ∈ F(e). The cardinality |FA| is the number of soft-points belonging to FA.

Let ⊆̃ denote the soft subset relation on soft sets with (possibly) different parameter domains, and let ∪̃ and \̃ denote the usual soft union and soft difference (defined parameterwise). Let ∅A be the null soft set on A (i.e., F(e) = ∅ for all e ∈ A).

A soft-matroid is an ordered pair

M̃ = (FA, G),

where G is a collection of sub-soft-sets of FA satisfying the following axioms:


# Page. 91

![Page Image](https://bcdn.docswell.com/page/V7PK4PKVJ8.jpg)

(SM1) (Null axiom) ∅A ∈ G.

(SM2) (Hereditary axiom) If GA ∈ G and G′A ⊆̃ GA, then G′A ∈ G.

(SM3) (Exchange axiom) If GA, HA ∈ G with |GA| < |HA|, then there exists a soft-point p_x^e of HA \̃ GA such that

GA ∪̃ p_x^e ∈ G.

In this case, M̃ is called a soft-matroid on FA.
Example 4.10.2 (Example of a soft-matroid: selecting nonredundant skills across job roles).
Let U be a finite set of skills:
U = {σ1 , σ2 , σ3 , σ4 },
where σ1 = Python, σ2 = Databases, σ3 = Cloud, σ4 = Security. Let A = {e1 , e2 , e3 } ⊆ E be
a set of parameters (job roles), where e1 = Backend, e2 = Data, e3 = DevOps.
Define a finite soft set FA = (F, A) over U by
F (e1 ) = {σ1 , σ2 },
F (e2 ) = {σ1 , σ2 , σ3 },
F (e3 ) = {σ1 , σ3 , σ4 }.
A soft-point p_σ^e belongs to FA (i.e., p_σ^e ∈̃ FA) exactly when σ ∈ F(e).

Let G be the family of all sub-soft-sets GA = (G, A) of FA satisfying the following “at most one skill per role” rule:

|G(e)| ≤ 1 for every e ∈ A.

(Thus, for each role e, either G(e) = ∅ or G(e) = {σ} for some σ ∈ F(e).)

Then M̃ = (FA, G) is a soft-matroid:
(i) Null axiom: the null soft set ∅A satisfies |∅A(e)| = 0 ≤ 1 for all e, hence ∅A ∈ G.

(ii) Hereditary axiom: if GA ∈ G and G′A ⊆̃ GA, then |G′(e)| ≤ |G(e)| ≤ 1 for all e, so G′A ∈ G.

(iii) Exchange axiom: let GA, HA ∈ G with |GA| < |HA| (i.e., HA has more soft-points). Then there exists some role e* ∈ A such that H(e*) ≠ ∅ and G(e*) = ∅. Choose the unique σ* with H(e*) = {σ*}, and consider the soft-point p_{σ*}^{e*}. It is a soft-point of HA \̃ GA, and adding it to GA gives

(GA ∪̃ p_{σ*}^{e*})(e*) = {σ*},  (GA ∪̃ p_{σ*}^{e*})(e) = G(e)  (e ≠ e*),

so the “at most one skill per role” condition still holds. Hence GA ∪̃ p_{σ*}^{e*} ∈ G.
Interpretation: G represents feasible (independent) selections of nonredundant skills across roles,
and the exchange axiom formalizes the ability to extend a smaller feasible assignment by adding
a skill from a larger feasible assignment in a role where nothing was chosen yet.


# Page. 92

![Page Image](https://bcdn.docswell.com/page/2JVVX2VRJQ.jpg)

4.11 Soft Bitopological Space
A soft bitopological space equips a universe with two soft topologies, supporting dual soft openness and separation analysis simultaneously [268–271].
Definition 4.11.1 (Soft bitopological space). Let X be a nonempty set and let A be a nonempty
set of parameters. Let τ1 and τ2 be two (not necessarily equal) soft topologies on X with respect
to the same parameter set A. Then the quadruple
(X, A, τ1 , τ2 )
is called a soft bitopological space. A soft set (F, A) ∈ τi is called τi -soft open (i = 1, 2), and
(F, A) is τi -soft closed if its soft complement (F, A)c belongs to τi .
Example 4.11.2 (A soft bitopological space for two “notions of openness” in urban accessibility).
Let X be a finite set of city districts (alternatives)
X = {x1 , x2 , x3 , x4 },
and let the parameter set be
A = {Transit, Safety}.
Intuitively, we will model two different criteria for when a family of districts is regarded as “soft
open”: one based on public-transport accessibility and the other based on public safety.
Define the following soft sets over X (with parameter set A):

0A(a) = ∅,  1A(a) = X  (a ∈ A),
and two nontrivial soft sets (F, A) and (G, A) by
F (Transit) = {x1 , x2 },
F (Safety) = {x1 , x3 },
G(Transit) = {x2 , x4 },
G(Safety) = {x3 , x4 }.
Soft topology τ1 (Transit-openness). Let
τ1 := { 0A , 1A , (F, A), (F, A)c }.
Then τ1 is a soft topology on X (with parameter set A), since it contains 0A , 1A , is closed under
finite soft intersections, and under arbitrary soft unions.
Soft topology τ2 (Safety-openness). Let
τ2 := { 0A , 1A , (G, A), (G, A)c }.
Similarly, τ2 is a soft topology on X .
Therefore the quadruple
(X, A, τ1 , τ2 )
is a soft bitopological space in the sense of Definition 4.11.1.
Interpretation. A τ1 -soft open set represents districts that are “open” under the transit-based
viewpoint, while a τ2 -soft open set represents districts that are “open” under the safety-based
viewpoint. For instance, (F, A) ∈ τ1 encodes a transit-favorable selection (parameterwise),
whereas (G, A) ∈ τ2 encodes a safety-favorable selection.
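Each of the two soft topologies in this example is small enough to verify by brute force. A sketch, with the dict-of-frozensets encoding being my own convention:

```python
# Verify that τ1 = {0_A, 1_A, (F, A), (F, A)^c} is closed under pairwise
# soft union and soft intersection (computed parameterwise).
from itertools import product

X = {"x1", "x2", "x3", "x4"}
A = ["Transit", "Safety"]

F  = {"Transit": frozenset({"x1", "x2"}), "Safety": frozenset({"x1", "x3"})}
Fc = {a: frozenset(X - F[a]) for a in A}          # soft complement (F, A)^c
bottom = {a: frozenset() for a in A}              # 0_A
top    = {a: frozenset(X) for a in A}             # 1_A

tau1 = [bottom, top, F, Fc]
key = lambda S: tuple(S[a] for a in A)            # hashable normal form
members = {key(S) for S in tau1}

for S, T in product(tau1, repeat=2):
    union = {a: S[a] | T[a] for a in A}
    inter = {a: S[a] & T[a] for a in A}
    assert key(union) in members and key(inter) in members

print("τ1 is a soft topology on (X, A)")
```

The same check with (G, A) in place of (F, A) confirms that τ2 is a soft topology as well.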


# Page. 93

![Page Image](https://bcdn.docswell.com/page/5EGLVRL6JL.jpg)

4.12 Soft Module
A soft module is a parameterized family of submodules of a fixed module, enabling context-dependent linear structure modeling [272–276].
Definition 4.12.1 (Soft module). Let R be a ring, let M be a (left) R-module, and let A be a
nonempty set of parameters. A soft set over M with parameter set A is a pair (F, A) where
F : A −→ P(M ).
The soft set (F, A) is called a soft module over M if, for every a ∈ A, the value F (a) is a
(classical) R-submodule of M , i.e.,
F (a) ≤ M
(a ∈ A).
Equivalently, for each a ∈ A:
0M ∈ F (a),
x − y ∈ F (a) and rx ∈ F (a) (∀ x, y ∈ F (a), ∀ r ∈ R).
Thus, a soft module is a parameterized family of submodules of the fixed ground module M .
Example 4.12.2 (A soft module for permission-dependent access subspaces). Let R = R and
let M = R3 be the standard left R-module. Interpret vectors m = (m1 , m2 , m3 ) ∈ M as feature
triples (e.g., three measurable attributes of a record).
Let the parameter set be
A = {public, internal, admin},
representing three access policies. Define a mapping F : A → P(M ) by
F (public) = {(x, 0, 0) : x ∈ R},
F (internal) = {(x, y, 0) : x, y ∈ R},
F (admin) = R3 .
Then each F (a) is an R-submodule of M (indeed, a linear subspace): it contains 0M = (0, 0, 0),
is closed under subtraction, and is closed under scalar multiplication. Hence (F, A) is a soft
module over M in the sense of Definition 4.12.1.
Interpretation. Under public access, only the first coordinate can vary (others must be hidden
as 0); internal access reveals the first two coordinates; and admin access reveals all three. Thus the
admissible information for each policy forms a submodule, and the collection of these submodules
indexed by A is captured by the soft module (F, A).


# Page. 94

![Page Image](https://bcdn.docswell.com/page/4JQY6VY27P.jpg)

4.13 Soft Metric Space
A soft metric space assigns each parameter a metric on the universe, measuring distances under
varying contexts and uncertainty scenarios [277–280].
Definition 4.13.1 (Soft point and soft element). Let X be a nonempty universe and let E be
a nonempty set of parameters.
(i) A soft point of X is a soft set P_x^e = (P, E) such that

P(e) = {x}  and  P(e′) = ∅  (e′ ∈ E \ {e}),

for some (e, x) ∈ E × X.
(ii) A soft element of X (with support A ⊆ E ) is a soft set αA = (α, A) over X such that for
every a ∈ A,
α(a) = {xa } for some xa ∈ X,
and α(e) = ∅ (e ∈ E \ A).
Equivalently, a soft element is a (partial) choice function A → X, a 7→ xa . We write
SE(X, E) for the collection of all soft elements of X whose supports are contained in E .
Definition 4.13.2 (Soft real numbers and order). A soft real number (over E) is a mapping r̃ : E → R. It is nonnegative if r̃(e) ≥ 0 for all e ∈ E. Denote by R̃ᴱ₊ the set of all nonnegative soft real numbers.

For r̃, s̃ ∈ R̃ᴱ₊ define:

(r̃ ⊕ s̃)(e) := r̃(e) + s̃(e),  r̃ ≤ s̃ ⇐⇒ r̃(e) ≤ s̃(e) (∀e ∈ E).

Let 0̃ ∈ R̃ᴱ₊ be the zero soft real number, 0̃(e) = 0 for all e ∈ E.
Definition 4.13.3 (Soft metric and soft metric space). Let X be a nonempty universe and let
E be a nonempty parameter set. A mapping

dS : SE(X, E) × SE(X, E) −→ R̃ᴱ₊

is called a soft metric (or soft distance) on (X, E) if, for all α, β, γ ∈ SE(X, E), the following axioms hold:
(SM1) (Nonnegativity) 0̃ ≤ dS (α, β).
(SM2) (Identity of indiscernibles) dS (α, β) = 0̃ if and only if α = β .
(SM3) (Symmetry) dS (α, β) = dS (β, α).


# Page. 95

![Page Image](https://bcdn.docswell.com/page/K74W4MWPE1.jpg)

(SM4) (Triangle inequality) dS (α, γ) ≤ dS (α, β) ⊕ dS (β, γ).
In this case, the triple (X, E, dS ) is called a soft metric space.
Remark 4.13.4 (Relation to the soft-point (product) viewpoint). In the original approach of
Das–Samanta, a “soft metric” is defined on the set of all soft points, which can be identified
with E × X ; hence it essentially reduces to an ordinary metric on the product set E × X . The
soft-element based definition above generalizes that viewpoint: every soft-point metric induces
a soft-element metric, but the converse need not hold.
Example 4.13.5 (A soft metric space for multi-context sensor readings). Let X = R be the set
of possible temperature readings, and let
E = {morning, noon, night}
be a parameter set representing three measurement contexts (time slots).
A soft element α ∈ SE(X, E) assigns to each e ∈ E a singleton α(e) = {xe }, so we identify α
with the triple (xmorning , xnoon , xnight ) ∈ R3 .
Define dS : SE(X, E) × SE(X, E) → R̃ᴱ₊ by, for α = (xe)e∈E and β = (ye)e∈E,

dS(α, β)(e) := |xe − ye|  (e ∈ E).
Equivalently, dS (α, β) is the soft real number e 7→ |xe − ye |.
Then dS is a soft metric:
• Nonnegativity: dS (α, β)(e) = |xe − ye | ≥ 0 for all e, hence 0̃ ≤ dS (α, β).
• Identity: dS (α, β) = 0̃ iff |xe − ye | = 0 for all e, i.e. xe = ye for all e, hence α = β .
• Symmetry: |xe − ye | = |ye − xe | gives dS (α, β) = dS (β, α).
• Triangle inequality: for each e ∈ E,

dS(α, γ)(e) = |xe − ze| ≤ |xe − ye| + |ye − ze| = dS(α, β)(e) + dS(β, γ)(e) = (dS(α, β) ⊕ dS(β, γ))(e),

hence dS(α, γ) ≤ dS(α, β) ⊕ dS(β, γ).
Therefore (X, E, dS ) is a soft metric space.
Interpretation. Each soft element represents a day’s temperature profile across contexts (morning/noon/night), and the soft distance returns the absolute discrepancy at each context as a
nonnegative soft real number.
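The soft metric of this example is a contextwise absolute difference, which takes a few lines to implement; the sample temperature profiles below are hypothetical values of my own:

```python
# Soft elements are context-indexed temperature profiles; the soft distance
# d_S(α, β)(e) = |x_e − y_e| is a context-indexed dict (a soft real number).
E = ("morning", "noon", "night")

def d_S(alpha, beta):
    return {e: abs(alpha[e] - beta[e]) for e in E}

day1 = {"morning": 15.0, "noon": 24.0, "night": 12.0}
day2 = {"morning": 17.0, "noon": 21.0, "night": 12.0}
day3 = {"morning": 14.0, "noon": 26.0, "night": 10.0}

print(d_S(day1, day2))  # {'morning': 2.0, 'noon': 3.0, 'night': 0.0}

# Triangle inequality holds contextwise (⊕ is pointwise addition):
assert all(d_S(day1, day3)[e] <= d_S(day1, day2)[e] + d_S(day2, day3)[e]
           for e in E)
```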


# Page. 96

![Page Image](https://bcdn.docswell.com/page/LJ1Y48YXEG.jpg)

4.14 Soft probabilities
Soft probabilities assign to each parameter a probability measure on a universe, representing
context-dependent stochastic uncertainty for decision-making tasks explicitly [281–283].
Definition 4.14.1 (Statistical database, admissible samples, and frequency). Let Ω be a nonempty
outcome space. A statistical database (of length N ∈ N) is a finite sequence
Base = (ω1 , ω2 , . . . , ωN ),
ωi ∈ Ω.
Fix an integer m with 1 ≤ m ≤ N and a “freshness” index τ with 1 ≤ τ ≤ N − m + 1. Define the family of admissible samples (consecutive blocks) by

S(Base, m, τ) := { {i, i + 1, . . . , i + m − 1} | i = τ, τ + 1, . . . , N − m + 1 }.

For an event A ⊆ Ω and a sample I ∈ S(Base, m, τ), its frequency of occurrence on I is

μ(Base, A, I) := (1/|I|) Σ_{i∈I} χA(ωi),  where χA(ω) = 1 if ω ∈ A and χA(ω) = 0 if ω ∉ A.
Definition 4.14.2 (Soft probability (interval-valued, parameterized by (m, τ))). Let Base, m, and τ be as above. The soft probability of an event A ⊆ Ω on the database Base at parameters (m, τ) is the closed interval

Psoft(A | Base; m, τ) := [ p̲(A), p̄(A) ] ⊆ [0, 1],

where

p̲(A) := min_{I ∈ S(Base,m,τ)} μ(Base, A, I),  p̄(A) := max_{I ∈ S(Base,m,τ)} μ(Base, A, I).

Equivalently,

Psoft(A | Base; m, τ) = λ(χA, Base, m, τ),

where λ(f, Base, m, τ) denotes the (m, τ)-approximate mean interval of a real-valued function f : Ω → R:

λ(f, Base, m, τ) := [ min_{I ∈ S(Base,m,τ)} (1/|I|) Σ_{i∈I} f(ωi),  max_{I ∈ S(Base,m,τ)} (1/|I|) Σ_{i∈I} f(ωi) ].
Example 4.14.3 (Soft probability for a delayed-train event under rolling windows). Let Ω = {ω1, . . . , ω10} be ten commuting days, and let

Base = ((ωi, yi))_{i=1}^{10}
be a database where yi ∈ {0, 1} indicates whether the train was delayed on day ωi (yi = 1 means
“delayed”). Consider the event
A := {ωi ∈ Ω : yi = 1}.
Assume the observed delay indicators are
(y1 , . . . , y10 ) = (1, 0, 1, 0, 0, 1, 1, 0, 0, 1).


# Page. 97

![Page Image](https://bcdn.docswell.com/page/GJWGXZGK72.jpg)

Fix parameters (m, τ) so that the admissible index-family is the set of all contiguous windows of length m = 4 (this is a concrete choice of S(Base, m, τ)):

S(Base, 4, τ) := { {1, 2, 3, 4}, {2, 3, 4, 5}, . . . , {7, 8, 9, 10} }.

For each window I ∈ S(Base, 4, τ) define

μ(Base, A, I) := (1/|I|) Σ_{i∈I} χA(ωi) = (1/4) Σ_{i∈I} yi,
i.e. the empirical delay rate inside that window.
Compute the window means:

| I | (yi)i∈I | μ(Base, A, I) |
|---|---------|---------------|
| {1, 2, 3, 4} | (1, 0, 1, 0) | 2/4 = 0.50 |
| {2, 3, 4, 5} | (0, 1, 0, 0) | 1/4 = 0.25 |
| {3, 4, 5, 6} | (1, 0, 0, 1) | 2/4 = 0.50 |
| {4, 5, 6, 7} | (0, 0, 1, 1) | 2/4 = 0.50 |
| {5, 6, 7, 8} | (0, 1, 1, 0) | 2/4 = 0.50 |
| {6, 7, 8, 9} | (1, 1, 0, 0) | 2/4 = 0.50 |
| {7, 8, 9, 10} | (1, 0, 0, 1) | 2/4 = 0.50 |
Hence

p̲(A) = min_{I ∈ S(Base,4,τ)} μ(Base, A, I) = 0.25,  p̄(A) = max_{I ∈ S(Base,4,τ)} μ(Base, A, I) = 0.50.

Therefore the soft probability of “delay” on the database Base at parameters (4, τ) is

Psoft(A | Base; 4, τ) = [0.25, 0.50].
Interpretation. Depending on which admissible 4-day period (window) is regarded as representative under (m, τ ), the estimated delay probability ranges from 25% to 50%, so the uncertainty
is captured as an interval.
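The rolling-window computation of this example is a one-liner over slices. A minimal sketch reproducing the interval [0.25, 0.50]:

```python
# Soft probability of the delay event over all contiguous windows of
# length m = 4 in the 10-day database of Example 4.14.3.
y = [1, 0, 1, 0, 0, 1, 1, 0, 0, 1]   # delay indicators y_1, ..., y_10
m = 4

# Empirical delay rate μ(Base, A, I) on each window I = {i, ..., i+m-1}.
rates = [sum(y[i:i + m]) / m for i in range(len(y) - m + 1)]

p_lower, p_upper = min(rates), max(rates)
print([p_lower, p_upper])  # [0.25, 0.5]
```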
Remark 4.14.4 (Basic sanity properties). For fixed (Base, m, τ ):
1. Psoft (∅ | Base; m, τ ) = [0, 0] and Psoft (Ω | Base; m, τ ) = [1, 1].
2. If A ⊆ B , then Psoft (A | Base; m, τ ) ⊆ Psoft (B | Base; m, τ ) in the endpoint-wise order
(monotonicity).
3. If p̲(A) = p̄(A), the soft probability of A collapses to a classical (database-induced) point probability.


# Page. 98

![Page Image](https://bcdn.docswell.com/page/4EZL61LN73.jpg)

4.15 Soft SemiGroup
A semigroup is a nonempty set equipped with an associative binary operation, requiring closure
and no identity or inverses [98, 284, 285]. A soft semigroup is a soft set whose value at each
parameter is a subsemigroup of a fixed semigroup, closed under multiplication [286–289].
Definition 4.15.1 (Soft semigroup). Let (S, ·) be a semigroup, let E be a nonempty set of
parameters, and let A ⊆ E be nonempty. A soft semigroup over S is a soft set (F, A) over S ,
i.e.,
F : A −→ P(S),
such that for every parameter a ∈ A, the value set F (a) is a (classical) subsemigroup of S .
Equivalently, for each a ∈ A:
F (a) 6= ∅ and x · y ∈ F (a) for all x, y ∈ F (a).
Thus, a soft semigroup is a parameterized family of subsemigroups of the fixed ground semigroup
S.
Remark 4.15.2. If S is equipped with a compatible partial order ≤ (so (S, ·, ≤) is an ordered
semigroup), then (F, A) is often called a soft ordered semigroup when each F (a) is a subsemigroup
of S (for all a ∈ A).
Example 4.15.3 (A soft semigroup on (N0 , +)). Let (S, ·) = (N0 , +) be the additive semigroup
of nonnegative integers. Let the parameter set be
E = {Even, Mult3, AtLeast5},
A = E.
Define a soft set (F, A) over S by
F (Even) = {0, 2, 4, 6, . . . },
F (Mult3) = {0, 3, 6, 9, . . . },
F (AtLeast5) = {5, 6, 7, . . . }.
Then for each a ∈ A, the value F (a) is a nonempty subsemigroup of (N0 , +):
• If x, y ∈ F (Even), then x + y is even, so x + y ∈ F (Even).
• If x, y ∈ F (Mult3), then x + y is a multiple of 3, so x + y ∈ F (Mult3).
• If x, y ∈ F (AtLeast5), then x + y ≥ 10, hence x + y ∈ F (AtLeast5).
Therefore (F, A) is a soft semigroup over S in the sense of Definition 4.15.1.
Real-life interpretation. Let S represent feasible daily production counts in a factory. The
parameter Even enforces pairing/packaging constraints (even batch sizes), Mult3 models palletization in groups of three, and AtLeast5 encodes a minimum-run policy; each policy set is closed
under combining runs.
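The closure properties of the three value sets can be spot-checked on a finite range of witnesses (a sanity check, not a proof of the infinite claim); the predicate encoding is my own:

```python
# Each parameter's value set F(a) ⊆ N0 is given by a membership predicate;
# check closure under + on all pairs of small witnesses.
from itertools import product

policies = {
    "Even":     lambda n: n % 2 == 0,
    "Mult3":    lambda n: n % 3 == 0,
    "AtLeast5": lambda n: n >= 5,
}

for name, member in policies.items():
    witnesses = [n for n in range(30) if member(n)]
    # closure: x, y ∈ F(a) implies x + y ∈ F(a)
    assert all(member(x + y) for x, y in product(witnesses, repeat=2))

print("each F(a) is closed under addition")
```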


# Page. 99

![Page Image](https://bcdn.docswell.com/page/Y76W2LW97V.jpg)

4.16 Soft HyperStructure and SuperHyperStructure
A hyperstructure is an algebraic system whose binary operation assigns each element-pair a
nonempty subset, generalizing ordinary operations [290,291]. A superhyperstructure iterates hyperstructural levels so hyperoperations act on set-valued objects across multiple power-set layers,
forming hierarchies [292]. A soft hyperstructure is a parameterized family of subhyperstructures,
assigning each parameter a nonempty subset closed under the underlying hyperoperation(s) of a
given hyperalgebra [293–295].
Definition 4.16.1 (Soft HyperStructure). [293–295] Let H be a hyperalgebra (hyperstructure) with signature

Σ = { ⋆i : H^{ni} → P∗(H) | i ∈ I },

where P∗(H) denotes the family of all nonempty subsets of H.

A subset K ⊆ H is called a subhyperstructure of H if for every i ∈ I and every (x1, . . . , xni) ∈ K^{ni} one has

⋆i(x1, . . . , xni) ⊆ K.
Let A be a nonempty parameter set. A (non-null) soft set over H is a pair (F, A) with
F : A → P∗ (H),
Supp(F, A) := {a ∈ A : F(a) ≠ ∅} ≠ ∅.
Then (F, A) is called a Soft HyperStructure over H if for every a ∈ Supp(F, A), the subset
F (a) ⊆ H is a subhyperstructure of H .
Definition 4.16.2 (Soft SuperHyperStructure). Let SH = SH(H, F) be a superhyperstructure with a levelled universe {H⟨m⟩}m≥0 and a family of superhyperoperations

F = {Fj}j∈J,  Fj : H⟨ℓj,1⟩ × · · · × H⟨ℓj,nj⟩ −→ P∗(H⟨rj⟩),

where ℓj,1, . . . , ℓj,nj, rj ≥ 0.
Let A be a nonempty parameter set. A soft set over the levelled universe is a mapping

S : A −→ ∏_{m≥0} P(H⟨m⟩),  a ↦ (Sa⟨m⟩)_{m≥0},

and its support is

Supp(S) := {a ∈ A : ∃m ≥ 0, Sa⟨m⟩ ≠ ∅}.
For each j ∈ J, extend Fj to subsets by the usual set-extension: for Xk ⊆ H⟨ℓj,k⟩,

Fj(X1, . . . , Xnj) := ⋃_{(x1,...,xnj) ∈ X1×···×Xnj} Fj(x1, . . . , xnj).
The soft set S is called a Soft SuperHyperStructure over SH(H, F) if for every a ∈ Supp(S) and every j ∈ J one has the levelwise closure condition

Fj( Sa⟨ℓj,1⟩, . . . , Sa⟨ℓj,nj⟩ ) ⊆ Sa⟨rj⟩.

Equivalently, for each a ∈ Supp(S), the family (Sa⟨m⟩)_{m≥0} determines a sub-superhyperstructure of SH(H, F) under the restrictions of all operations Fj.


# Page. 100

![Page Image](https://bcdn.docswell.com/page/G75M21MD74.jpg)

Example 4.16.3 (A soft superhyperstructure for a multi-level academic program catalogue).
Let the base (level-0) universe H⟨0⟩ be the set of courses

H⟨0⟩ = {c1, c2, c3, c4},
interpreted as (say) Linear Algebra (c1 ), Discrete Math (c2 ), Machine Learning (c3 ), and Databases
(c4 ).
Define level-1 objects as course bundles (nonempty sets of courses):

H⟨1⟩ := P∗(H⟨0⟩),

and level-2 objects as program tracks (nonempty sets of bundles):

H⟨2⟩ := P∗(H⟨1⟩).
Consider two superhyperoperations (so J = {1, 2}):

F1 : H⟨0⟩ × H⟨0⟩ −→ P∗(H⟨1⟩),  F1(x, y) := { {x, y} },

which forms a 2-course bundle, and

F2 : H⟨1⟩ × H⟨1⟩ −→ P∗(H⟨2⟩),  F2(B1, B2) := { {B1, B2} },

which forms a track consisting of two bundles. Let SH = SH(H, F) with F = {F1, F2}.
Let the parameter set be
A = {AI, DB},
representing two departments that curate their own multi-level catalogues.
Define a soft set over the levelled universe,

S : A −→ P(H⟨0⟩) × P(H⟨1⟩) × P(H⟨2⟩),  a ↦ (Sa⟨0⟩, Sa⟨1⟩, Sa⟨2⟩),

by

SAI⟨0⟩ = {c1, c3},  SAI⟨1⟩ = { {c1, c3} },  SAI⟨2⟩ = { { {c1, c3} } },
SDB⟨0⟩ = {c2, c4},  SDB⟨1⟩ = { {c2, c4} },  SDB⟨2⟩ = { { {c2, c4} } }.
Verification of the closure conditions. For a = AI we have

F1(SAI⟨0⟩, SAI⟨0⟩) = ⋃_{x,y ∈ {c1,c3}} { {x, y} } ⊆ SAI⟨1⟩,

and

F2(SAI⟨1⟩, SAI⟨1⟩) = { { {c1, c3} } } = SAI⟨2⟩.

The same argument holds for a = DB. Hence, for each a ∈ A and each operation Fj,

Fj( Sa⟨ℓj,1⟩, . . . , Sa⟨ℓj,nj⟩ ) ⊆ Sa⟨rj⟩,

so S is a Soft SuperHyperStructure over SH(H, F).
Real-life interpretation. Each department parameter a ∈ A selects its admissible courses
(level 0), the bundles it permits (level 1), and the tracks it offers (level 2), while ensuring
that combining allowed items via the curriculum-construction rules F1 and F2 stays within the
department’s own catalogue.


# Page. 101

![Page Image](https://bcdn.docswell.com/page/9J2941RMER.jpg)

4.17 Soft Graph Neural Networks
A Graph Neural Network (GNN) learns node- or graph-level representations by iterative message
passing, aggregating neighbors’ features to predict labels or properties [296–301]. A Soft Graph
Neural Network (SGNN) is a parameter-indexed family of GNN-based selections, where each
context induces a soft set of chosen nodes via thresholds. Let G = (V, E) be a finite graph
(directed or undirected) with a node-feature map X : V → Rd . In many learning tasks, a
graph neural network (GNN) produces, for each node v ∈ V, a score s(v) ∈ [0, 1] (e.g., a class-probability after a sigmoid/softmax). For our purposes, it suffices to treat a GNN as a black box
that, given (G, X) and a choice of model/context parameters, returns such a score function.
Definition 4.17.1 (Soft Graph Neural Network (SGNN)). Let G = (V, E) be a finite graph
with node features X : V → Rd . Let A be a nonempty set of soft parameters (contexts), such
as: task modes, expert profiles, time indices, hyperparameter regimes, or view-definitions.
A Soft Graph Neural Network (SGNN) on (G, X) with parameter set A is a family
N = {(sa, τa)}a∈A,
where for each a ∈ A:
• sa : V → [0, 1] is a node-scoring function produced by a (possibly shared-weight) GNN under context a (for instance sa = Readout ∘ Encoder_a(G, X));
• τa ∈ [0, 1] is a (possibly a-dependent) decision threshold.
Definition 4.17.2 (Soft set induced by an SGNN). Let N = {(sa , τa )}a∈A be an SGNN as in
Definition 4.17.1. Define a mapping
FN : A −→ P(V )
by
FN (a) := { v ∈ V | sa (v) ≥ τa }.
Then (FN , A) is called the SGNN-induced soft set (of selected nodes).
Theorem 4.17.3 (Soft-set structure and well-definedness of SGNN outputs). Let N = {(sa , τa )}a∈A
be an SGNN on (G, X). Then the mapping FN : A → P(V ) in Definition 4.17.2 is well-defined,
and hence (FN , A) is a (crisp) soft set over the universe V .
Proof. Fix a ∈ A. By definition, sa : V → [0, 1] and τa ∈ [0, 1]. Therefore the predicate
“sa (v) ≥ τa ” is meaningful for every v ∈ V , and the set
FN (a) = {v ∈ V | sa (v) ≥ τa }
is a subset of V . Since this holds for every a ∈ A, the rule a 7→ FN (a) defines a single-valued
mapping FN : A → P(V ). Consequently, (FN , A) is a soft set over the universe V .
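The induced soft set of Definition 4.17.2 can be sketched in a few lines, treating the GNN as a black box. The contexts, score values, and thresholds below are illustrative placeholders, not outputs of a trained model:

```python
# Sketch of an SGNN-induced soft set (Definition 4.17.2): each context a
# yields a score function s_a and a threshold tau_a; the soft set maps a
# to the set of nodes scoring at least tau_a.
V = ["v1", "v2", "v3", "v4"]

# Illustrative placeholder scores; a real SGNN would compute these
# from (G, X) under each context a.
scores = {
    "task1": {"v1": 0.9, "v2": 0.4, "v3": 0.7, "v4": 0.2},
    "task2": {"v1": 0.3, "v2": 0.8, "v3": 0.6, "v4": 0.5},
}
thresholds = {"task1": 0.5, "task2": 0.55}

def induced_soft_set(scores, thresholds):
    """F_N(a) = {v in V | s_a(v) >= tau_a} for every context a."""
    return {a: {v for v, s in sa.items() if s >= thresholds[a]}
            for a, sa in scores.items()}

F = induced_soft_set(scores, thresholds)
print(sorted(F["task1"]))  # ['v1', 'v3']
print(sorted(F["task2"]))  # ['v2', 'v3']
```

The same dictionary-of-sets shape also covers the edge-version of Remark 4.17.5, with edge scores in place of node scores.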
Theorem 4.17.4 (Representation of any soft set by a trivial SGNN). Let (F, A) be any (crisp)
soft set over a finite universe V . Then there exists an SGNN N = {(sa , τa )}a∈A on (G, X) (for
any fixed graph G on vertex set V and any features X ) such that FN (a) = F (a) for all a ∈ A.
100


# Page. 102

![Page Image](https://bcdn.docswell.com/page/DEY4MZDPJM.jpg)

Proof. Define, for each a ∈ A, the score function sa : V → [0, 1] by the indicator rule

sa(v) := 1 if v ∈ F(a), and sa(v) := 0 if v ∉ F(a),

and set τa := 1/2.
Then FN (a) = {v ∈ V | sa (v) ≥ τa } = F (a). Such (sa , τa ) can be realized by a degenerate
“network” that ignores (G, X) and outputs constants. Hence an SGNN can reproduce any given
soft set, so the SGNN formalism generalizes soft sets.
Remark 4.17.5 (Edge-version). One may analogously define an SGNN-induced soft set of edges by letting sa^E : E → [0, 1] be an edge-score function (e.g., attention weights) and defining FN^E(a) = {ε ∈ E | sa^E(ε) ≥ τa^E}, yielding a soft set over the universe E.
4.18 HyperSoft Graph Neural Network
A HyperSoft Graph Neural Network maps multi-attribute parameter tuples to GNN-based node selections, yielding hypersoft sets over graph vertices.
Definition 4.18.1 (Hypersoft parameter space). Let m ≥ 1 and let A1 , . . . , Am be nonempty
attribute domains (parameter groups), e.g.,
A1 = {task modes},
A2 = {expert profiles},
A3 = {time contexts},
A4 = {hyperparameter regimes}, etc.
Define the hypersoft parameter space by the Cartesian product
C := A1 × A2 × · · · × Am .
An element a ∈ C is an m-tuple a = (a1 , . . . , am ) with ai ∈ Ai .
Definition 4.18.2 (HyperSoft Graph Neural Network (HSGNN)). Let G = (V, E) be a finite graph and let X : V → Rd be a node-feature map. Fix a hypersoft parameter space C = A1 × · · · × Am as in Definition 4.18.1.
A HyperSoft Graph Neural Network (HSGNN) on (G, X) with parameter space C is a family

N = {(sa, τa)}a∈C,

where, for each a ∈ C:
• sa : V → [0, 1] is a node-scoring function produced by a (possibly shared-weight) GNN under context a; concretely, one may fix an architecture Φ : (G, X, θ) ↦ Φ(G, X; θ) ∈ [0, 1]^V and a parameter-selection map θ : C → Θ and set sa(·) := Φ(G, X; θ(a))(·);
• τa ∈ [0, 1] is a (possibly context-dependent) decision threshold.


# Page. 103

![Page Image](https://bcdn.docswell.com/page/VJNYW36M78.jpg)

Definition 4.18.3 (Hypersoft set induced by an HSGNN). Let N = {(sa , τa )}a∈C be an HSGNN as in Definition 4.18.2. Define
FN : C −→ P(V )
by
FN (a) := { v ∈ V | sa (v) ≥ τa }.
Then (FN , C) is called the HSGNN-induced hypersoft set (of selected nodes) over the universe
V.
Theorem 4.18.4 (Hypersoft-set structure and well-definedness). Let N be an HSGNN on
(G, X) with hypersoft parameter space C . Then the induced mapping FN in Definition 4.18.3 is
well-defined and determines a hypersoft set over the universe V , i.e.,
(FN , C) is a hypersoft set on V.
Proof. Fix a ∈ C . By Definition 4.18.2, sa : V → [0, 1] is a function and τa ∈ [0, 1] is a scalar.
Hence the subset
FN (a) = { v ∈ V | sa (v) ≥ τa }
is uniquely determined and satisfies FN (a) ⊆ V . Therefore the assignment a 7→ FN (a) defines
a function FN : C → P(V ), so (FN , C) is a hypersoft set by the definition of hypersoft sets
(mapping from the Cartesian product parameter space into a powerset). This also proves well-definedness.
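A minimal sketch of the HSGNN-induced hypersoft set: contexts are tuples drawn from a product C = A1 × A2 of attribute domains, and the scoring rule below is a hypothetical stand-in for Φ(G, X; θ(a)), not a real GNN:

```python
# Sketch of an HSGNN-induced hypersoft set (Definition 4.18.3): one
# subset of V per parameter tuple a in C = A1 x A2.
from itertools import product

A1 = ["classify", "rank"]   # task modes
A2 = ["t0", "t1"]           # time contexts
V = ["v1", "v2", "v3"]

def s(a, v):
    # Hypothetical placeholder scoring rule; a real HSGNN would run a
    # GNN with parameters theta(a).
    base = {"v1": 0.8, "v2": 0.5, "v3": 0.3}[v]
    return base if a[0] == "classify" else 1.0 - base

tau = 0.45  # shared threshold tau_a for every context

# F_N(a) = {v | s_a(v) >= tau_a}, indexed by the tuple a
F = {a: {v for v in V if s(a, v) >= tau} for a in product(A1, A2)}
print(sorted(F[("classify", "t0")]))  # ['v1', 'v2']
print(sorted(F[("rank", "t0")]))      # ['v2', 'v3']
```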
4.19 Soft Natural Languages
A natural language is a human communication system with grammar and meaning, enabling compositional expression and context-dependent interpretation [302–305]. A soft natural language
models ambiguous linguistic phenomena by mapping contexts/parameters to sets of admissible
interpretations, enabling structured uncertainty-aware processing. Let Σ be a (finite or countable) alphabet and let Σ∗ denote the set of all finite strings over Σ. Fix a nonempty set
U ⊆ Σ∗ ,
whose elements are regarded as linguistic objects (e.g., utterances, sentences, queries, or documents).
Let E be a nonempty set of linguistic parameters (contexts), whose elements may encode a
natural language label (English, Japanese, etc.), a dialect, a register (formal/informal), a domain
(medical/legal), or a time slice.
An acceptability relation is any binary relation
|= ⊆ U × E,
where u |= e is read as “the linguistic object u is acceptable under parameter e”.


# Page. 104

![Page Image](https://bcdn.docswell.com/page/YE9PX9LWJ3.jpg)

Definition 4.19.1 (Soft Natural Languages). Let U be a linguistic universe and let E be a
parameter set as above. Fix a nonempty subset A ⊆ E of parameters and an acceptability
relation |=⊆ U × A.
Define the mapping

F|= : A −→ P(U),   F|=(a) := { u ∈ U | u |= a }.

The pair

SNL := (F|=, A)
is called a Soft Natural Languages structure on U (with parameter set A). For each a ∈ A, the
set F|= (a) is the collection of utterances judged acceptable under the linguistic context a.
Theorem 4.19.2 (Soft-set structure and well-definedness). Every Soft Natural Languages structure SNL = (F|= , A) in Definition 4.19.1 is a (crisp) soft set over the universe U . Equivalently,
F|= is a well-defined mapping A → P(U ).
Proof. Fix any a ∈ A. By construction,
F|= (a) = {u ∈ U | u |= a}
is a subset of U , hence F|= (a) ∈ P(U ). Since this holds for every a ∈ A, the rule a 7→ F|= (a)
defines a single-valued mapping F|= : A → P(U ). Therefore (F|= , A) is a soft set over U in the
sense of Molodtsov.
Theorem 4.19.3 (Representation of soft sets as Soft Natural Languages). Let (F, A) be any
soft set over the linguistic universe U (i.e., F : A → P(U )). Define a relation |=F ⊆ U × A by
u |=F a  :⇐⇒  u ∈ F(a).
Then the induced mapping F|=F satisfies F|=F = F , hence (F, A) is exactly the Soft Natural
Languages structure generated by |=F .
Proof. For each a ∈ A,
F|=F (a) = {u ∈ U | u |=F a} = {u ∈ U | u ∈ F (a)} = F (a).
Thus F|=F = F pointwise on A, proving the claim.
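The construction of Definition 4.19.1 and the round trip of Theorem 4.19.3 can be checked on toy data; the utterances, contexts, and acceptability pairs below are illustrative:

```python
# Soft Natural Languages sketch: an acceptability relation |= induces a
# soft set F(a) = {u | u |= a}, and rebuilding |= from F gives F back.
U = {"hello", "good morning", "hey", "dear sir"}
A = {"formal", "informal"}

# Acceptability relation |= as a set of (utterance, parameter) pairs.
accept = {
    ("good morning", "formal"), ("dear sir", "formal"),
    ("hello", "informal"), ("hey", "informal"), ("good morning", "informal"),
}

def soft_set_from_relation(accept, U, A):
    """F_|=(a) = {u in U | u |= a}."""
    return {a: {u for u in U if (u, a) in accept} for a in A}

F = soft_set_from_relation(accept, U, A)

# Theorem 4.19.3: the relation rebuilt from F induces the same soft set.
rebuilt = {(u, a) for a in A for u in F[a]}
assert soft_set_from_relation(rebuilt, U, A) == F
print(sorted(F["formal"]))  # ['dear sir', 'good morning']
```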
4.20 Soft n-SuperHyperGraphs
An n-SuperHyperGraph is a hierarchical hypergraph in which vertices are built by iterating
the nonempty powerset construction n times on a finite base set, and each superhyperedge
is a nonempty family of such n-supervertices encoding higher-order interactions [15, 306, 307].
A soft n-SuperHyperGraph is a parameterized collection of induced sub-n-SuperHyperGraphs,
where each parameter selects a subset of n-supervertices together with a compatible subset of
superhyperedges whose endpoints lie entirely within the selected vertices [308, 309].


# Page. 105

![Page Image](https://bcdn.docswell.com/page/GE8D29XRED.jpg)

Definition 4.20.1 (Iterated powerset and iterated nonempty powerset). [292, 310, 311] Let H be a nonempty set. Define the k-th iterated powerset P^k(H) recursively by

P^0(H) := H,   P^{k+1}(H) := P(P^k(H))   (k ≥ 0),

where P(·) denotes the usual powerset.
Define the k-th iterated nonempty powerset P*^k(H) by omitting ∅ at every level:

P*^0(H) := H,   P*^{k+1}(H) := P(P*^k(H)) \ {∅}   (k ≥ 0).
Definition 4.20.2 (n-SuperHyperGraph). [15, 307, 312, 313] Let V0 be a finite nonempty base vertex set, and let n ≥ 1 be an integer. Set

Vn(V0) := P*^{n−1}(V0)   (⊆ P^{n−1}(V0)).

Elements of Vn(V0) are called n-supervertices.
An n-SuperHyperGraph is a pair

SHG^(n) = (V, E),

where

V ⊆ Vn(V0) is finite and nonempty,   E ⊆ P(V) \ {∅}.

Each element of E is called an n-superhyperedge. (Thus every superhyperedge is a nonempty subset of the n-supervertex set V.)
In particular, for n = 1 one has V1(V0) = V0, so SHG^(1) is an ordinary (finite) hypergraph on V0. For n = 2, vertices are nonempty subsets of V0 (groups of base vertices), and edges are families of such groups.
Definition 4.20.3 (Soft n-SuperHyperGraph). [314, 315] Let SHG^(n) = (V, E) be an n-SuperHyperGraph and let C be a nonempty set of parameters. A soft n-SuperHyperGraph (over SHG^(n) with parameter set C) is a 5-tuple

(V, E, C, A, B),

where

A : C −→ P(V),   B : C −→ P(E),

such that for each c ∈ C the pair (A(c), B(c)) is a (parameter-induced) sub-superhypergraph of SHG^(n), i.e.,

A(c) ⊆ V,   B(c) ⊆ { e ∈ E : e ⊆ A(c) }.
Example 4.20.4 (A soft 2-superhypergraph for university course grouping). Let the base vertex
set be a small set of students
V0 = {s1 , s2 , s3 , s4 , s5 }.


# Page. 106

![Page Image](https://bcdn.docswell.com/page/LELM2W827R.jpg)

Since V2(V0) = P*^1(V0), a 2-supervertex is a nonempty group of students. Define a 2-SuperHyperGraph SHG^(2) = (V, E) by

V = { v1 = {s1, s2}, v2 = {s2, s3}, v3 = {s4}, v4 = {s5}, v5 = {s3, s4} } ⊆ V2(V0),

and by specifying superhyperedges as families of these student-groups:

E = { e1 = {v1, v2, v5}, e2 = {v2, v3}, e3 = {v3, v4}, e4 = {v1, v4} } ⊆ P(V) \ {∅}.
Interpretation: each vi is a base study-group, and each e ∈ E is a higher-order collaboration
constraint among several groups (e.g., a shared project, lab session, or timetable coupling).
Let C = {AI, DB, Math} be course-topic parameters. Define
A : C → P(V ),
B : C → P(E),
by selecting, for each topic, the relevant student-groups and the collaboration constraints among
them:
A(AI) = {v1, v2, v5},   B(AI) = {e1},
A(DB) = {v2, v3},   B(DB) = {e2},
A(Math) = {v3, v4},   B(Math) = {e3}.
Then for every c ∈ C we have A(c) ⊆ V and
B(c) ⊆ { e ∈ E : e ⊆ A(c) },
so (A(c), B(c)) is a sub-superhypergraph of SHG^(2). Hence

(V, E, C, A, B)

is a soft 2-superhypergraph, encoding topic-dependent selections of student-groups and their higher-order collaboration constraints.
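The two closure conditions of Definition 4.20.3 can be checked mechanically for this example; the sketch below encodes each student-group as a frozenset:

```python
# Verifying the soft 2-superhypergraph conditions of Example 4.20.4:
# A(c) ⊆ V, and every edge in B(c) uses only groups from A(c).
v1, v2, v3, v4, v5 = (frozenset({"s1", "s2"}), frozenset({"s2", "s3"}),
                      frozenset({"s4"}), frozenset({"s5"}),
                      frozenset({"s3", "s4"}))
V = {v1, v2, v3, v4, v5}
E = {"e1": {v1, v2, v5}, "e2": {v2, v3}, "e3": {v3, v4}, "e4": {v1, v4}}

A = {"AI": {v1, v2, v5}, "DB": {v2, v3}, "Math": {v3, v4}}
B = {"AI": {"e1"}, "DB": {"e2"}, "Math": {"e3"}}

def is_soft_2_shg(V, E, A, B):
    # A(c) <= V is the vertex condition; E[e] <= A(c) is the edge condition.
    return all(A[c] <= V and all(E[e] <= A[c] for e in B[c]) for c in A)

print(is_soft_2_shg(V, E, A, B))  # True
```

Assigning e1 to the DB parameter instead would violate the edge condition, since {v1, v2, v5} is not contained in A(DB) = {v2, v3}.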
4.21 Recursive Soft SuperHyperGraph
An (n, k)-recursive SuperHyperGraph has level-n supervertices (iterated powersets) and depth-k
recursive edges that may include supervertices and lower-level edges as elements. We restrict to
well-founded recursive superhyperedges (no membership cycles).
Definition 4.21.1 ((n, k)-recursive SuperHyperGraph). [316] Fix a base (ground) set V0 and let n, k ∈ N ∪ {0}.
(Iterated powersets). Define the iterated powersets by

P^0(V0) = V0,   P^{n+1}(V0) = P(P^n(V0))   (n ≥ 0).

An (n, k)-recursive SuperHyperGraph is a pair

RSHG^(n,k) = (V, E)

satisfying:


# Page. 107

![Page Image](https://bcdn.docswell.com/page/4JMY8969JW.jpg)

(i) (Hierarchical supervertex set). V ⊆ P^n(V0).
(ii) (Recursive superhyperedge family). E ⊆ 2^{V,k} \ {∅}, where 2^{V,k} is the depth-k powerset universe constructed from S = V as in Definition ??.
To express the requirement that a recursive superhyperedge uses only vertices from a chosen subset, we define a vertex-support operator

supp_k : P^⟨k⟩(V) −→ P(V)

by recursion:

supp_0(v) := {v}   (v ∈ V),
supp_{k+1}(X) := ⋃_{Y ∈ X} supp_k(Y)   (X ∈ P^⟨k+1⟩(V)).

Thus, supp_k(x) is the set of all base vertices that appear anywhere inside the nested object x.
Remark 4.21.2. If W ⊆ V, then for any x ∈ P^⟨k⟩(V) we have

x ∈ P^⟨k⟩(W)  =⇒  supp_k(x) ⊆ W.

Conversely, if one defines the restriction of recursive edges to W by

E|_W := { e ∈ E : supp_k(e) ⊆ W },

then E|_W consists precisely of those recursive edges of E that only use vertices from W.
We recall that an (n, k)-recursive SuperHyperGraph is a pair RSHG^(n,k) = (V, E) where V ⊆ P^n(V0) for a fixed ground set V0 and E ⊆ P^⟨k⟩(V) \ {∅}.
Definition 4.21.3 ((n, k)-recursive soft superhypergraph). Let RSHG^(n,k) = (V, E) be an (n, k)-recursive SuperHyperGraph, and let C be a nonempty set of parameters. An (n, k)-recursive soft SuperHyperGraph (over RSHG^(n,k) with parameter set C) is a quintuple

S := (V, E, C, A, B),

where

A : C → P(V),   B : C → P(E),

such that for every c ∈ C the pair (A(c), B(c)) is a sub-(n, k)-recursive SuperHyperGraph of (V, E) in the following sense:
(RS1) (Vertex inclusion) A(c) ⊆ V;
(RS2) (Recursive-edge compatibility) for every e ∈ B(c) one has supp_k(e) ⊆ A(c). Equivalently,

B(c) ⊆ E|_{A(c)} := { e ∈ E : supp_k(e) ⊆ A(c) }.


# Page. 108

![Page Image](https://bcdn.docswell.com/page/PJR95GP979.jpg)

Remark 4.21.4. When k = 1, each recursive edge e ∈ P^⟨1⟩(V) = P(V) is simply a subset of V, and supp_1(e) = ⋃_{v∈e} {v} = e. Hence (RS2) becomes the usual induced-edge condition e ⊆ A(c).
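The support operator supp_k is a direct recursion; a minimal sketch follows, using strings as base vertices and frozensets for nesting (an encoding chosen for this sketch):

```python
# supp_0(v) = {v}; supp_{k+1}(X) = union of supp_k(Y) over Y in X.
def supp(k, x):
    if k == 0:
        return {x}
    out = set()
    for y in x:
        out |= supp(k - 1, y)
    return out

# A depth-2 recursive edge over base vertices "a", "b", "c" (illustrative).
e = frozenset({frozenset({"a", "b"}), frozenset({"b", "c"})})
print(sorted(supp(2, e)))  # ['a', 'b', 'c']

# Remark 4.21.4: for k = 1 the support of a plain edge is the edge itself.
assert supp(1, frozenset({"a", "b"})) == {"a", "b"}
```

With this operator, condition (RS2) is the one-line check `supp(k, e) <= A[c]` for every e in B[c].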
Theorem 4.21.5 (Simultaneous generalization). The concept in Definition 4.21.3 generalizes
both
(a) (n, k)-recursive SuperHyperGraphs, and
(b) soft n-SuperHyperGraphs (soft SuperHyperGraphs in the non-recursive sense).
Proof. (a) Let RSHG(n,k) = (V, E) be any (n, k)-recursive SuperHyperGraph. Take the singleton
parameter set C := {∗} and define
A(∗) := V,
B(∗) := E.
Then (RS1) holds trivially. For (RS2), every e ∈ B(∗) = E satisfies supp_k(e) ⊆ V = A(∗) by definition of supp_k. Hence (V, E, C, A, B) is an (n, k)-recursive soft SuperHyperGraph. Moreover, forgetting the (trivial) parameterization recovers exactly (V, E).
(b) Let (V, E, C, A, B) be an (n, 1)-recursive soft SuperHyperGraph. Since k = 1, we have E ⊆ P^⟨1⟩(V) \ {∅} = P(V) \ {∅}, so each edge e ∈ E is a nonempty subset of V. By
Remark 4.21.4, condition (RS2) becomes
e ∈ B(c) =⇒ e ⊆ A(c),
equivalently,
B(c) ⊆ { e ∈ E : e ⊆ A(c) }.
Thus (V, E, C, A, B) is precisely a soft n-SuperHyperGraph in the standard (non-recursive)
sense.
Conversely, any soft n-SuperHyperGraph (V, E, C, A, B) (with E ⊆ P(V) \ {∅} and B(c) ⊆ {e ∈ E : e ⊆ A(c)}) is an (n, 1)-recursive soft SuperHyperGraph because supp_1(e) = e for all e ⊆ V.
4.22 Hierarchical Soft SuperHyperGraph
A hierarchical superhypergraph permits vertices from several powerset levels and allows edges
to join mixed-level vertices, while enforcing a downward-closure coherence principle [208].
Definition 4.22.1 (Nonempty powerset tower and hierarchical universe). Let V0 be a finite nonempty base set and fix r ∈ N0. Define a nonempty powerset tower (P^⟨k⟩(V0))_{k=0}^{r} by

P^⟨0⟩(V0) := V0,   P^⟨k+1⟩(V0) := P(P^⟨k⟩(V0)) \ {∅}   (0 ≤ k < r).

Set the hierarchical universe

Ur(V0) := ⋃_{k=0}^{r} P^⟨k⟩(V0).

For x ∈ Ur(V0), define its level by

ℓ(x) := min{ k ∈ {0, 1, . . . , r} : x ∈ P^⟨k⟩(V0) }.


# Page. 109

![Page Image](https://bcdn.docswell.com/page/PEXQKX33JX.jpg)

Definition 4.22.2 (Downward closure). Let W ⊆ Ur(V0). Define recursively

D_0(W) := W,   D_{t+1}(W) := D_t(W) ∪ ⋃{ X | X ∈ D_t(W), ℓ(X) ≥ 1 }   (t ≥ 0).

Since the tower height is r, the sequence stabilizes by step r; we define the downward closure of W by

dcl(W) := D_r(W).

Then dcl(W) is the smallest subset of Ur(V0) that contains W and satisfies: if X ∈ dcl(W) with ℓ(X) ≥ 1, then X ⊆ dcl(W).
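The fixed-point computation of dcl can be sketched directly; the sketch assumes level-0 objects are strings and every higher-level object is a frozenset, which is an encoding choice of this example rather than part of the definition:

```python
# Downward closure dcl(W): repeatedly add the members of every element
# of level >= 1 until the set stabilizes (Definition 4.22.2).
def dcl(W):
    closed = set(W)
    changed = True
    while changed:
        changed = False
        for X in list(closed):
            if isinstance(X, frozenset):      # level(X) >= 1
                for member in X:
                    if member not in closed:
                        closed.add(member)
                        changed = True
    return closed

x = frozenset({"a", "b"})                     # a level-1 object
W = {frozenset({x, frozenset({"c"})})}        # a single level-2 object
closure = dcl(W)
assert x in closure and "a" in closure and "c" in closure
print(len(closure))  # 6: one level-2, two level-1, three level-0 objects
```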
Definition 4.22.3 (Hierarchical SuperHyperGraph of height r). Let V0 be a finite nonempty base set and fix r ∈ N0. Let Ur(V0) and ℓ(·) be as in Definition 4.22.1. A hierarchical SuperHyperGraph of height r on V0 is a pair

H^⟨r⟩ = (V, E)
satisfying:
(H1) (Hierarchical vertex set). V is a finite nonempty set with V ⊆ Ur (V0 ).
(H2) (Cross-level hyperedges). E ⊆ P(V ) \ {∅}.
(H3) (Coherence / downward closure). If X ∈ V and `(X) ≥ 1, then X ⊆ V .
For each k ∈ {0, . . . , r}, define the k-th layer

Vk := { x ∈ V : ℓ(x) = k },

so that V = ⋃̇_{k=0}^{r} Vk (a disjoint union of layers).
Definition 4.22.4 (Hierarchical Soft SuperHyperGraph). Let H^⟨r⟩ = (V, E) be a hierarchical SuperHyperGraph of height r on a base set V0 in the sense of Definition 4.22.3. Let C be a nonempty set of parameters.
A hierarchical soft SuperHyperGraph (of height r) over H^⟨r⟩ is a quintuple

S = (V, E, C, A, B),

where

A : C −→ P(V),   B : C −→ P(E),
such that for every c ∈ C :
(HS1) (Downward-closed slice vertices). If X ∈ A(c) and `(X) ≥ 1, then X ⊆ A(c).
(HS2) (Induced slice edges). B(c) ⊆ { e ∈ E : e ⊆ A(c) }.
For each c ∈ C, the pair

S[c] := (A(c), B(c))

is called the parameter slice (a sub-hierarchical-superhypergraph) at c.


# Page. 110

![Page Image](https://bcdn.docswell.com/page/3EK95WYNED.jpg)

Proposition 4.22.5 (Each slice is a hierarchical SuperHyperGraph). Let S = (V, E, C, A, B)
be a hierarchical soft SuperHyperGraph as in Definition 4.22.4. Then for every c ∈ C , the slice
S[c] = (A(c), B(c)) is a hierarchical SuperHyperGraph (possibly with empty vertex set if one allows it; if one requires nonempty vertex sets, assume A(c) ≠ ∅).
Proof. Fix c ∈ C . Since A(c) ⊆ V ⊆ Ur (V0 ), the vertex condition holds. Also B(c) ⊆ E ⊆
P(V ) \ {∅} and by (HS2) each e ∈ B(c) satisfies e ⊆ A(c), hence B(c) ⊆ P(A(c)) \ {∅}. Finally,
(HS1) is exactly the coherence (downward-closure) axiom for the slice.
Theorem 4.22.6 (Hierarchical SuperHyperGraphs are special cases). Let H^⟨r⟩ = (V, E) be a hierarchical SuperHyperGraph. Define a singleton parameter set C := {c0} and maps

A(c0) := V,   B(c0) := E.

Then S = (V, E, C, A, B) is a hierarchical soft SuperHyperGraph, and its unique slice S[c0] coincides with H^⟨r⟩.
Proof. Since A(c0) = V, the slice-vertex downward-closure (HS1) holds because V satisfies (H3). Also B(c0) = E ⊆ {e ∈ E : e ⊆ V}, so (HS2) holds. Thus S is hierarchical soft, and S[c0] = (V, E) = H^⟨r⟩.
Theorem 4.22.7 (Soft n-SuperHyperGraphs embed into hierarchical soft SuperHyperGraphs). Fix n ≥ 1 and let V0 be a finite base set. Let SHG^(n) = (Vn, E) be an n-SuperHyperGraph whose vertex set satisfies Vn ⊆ P^⟨n⟩(V0), and let

H^⟨n⟩ := (dcl(Vn), E)

be the hierarchical structure obtained by closing the vertex set downward (Definition 4.22.2), keeping the same hyperedge family E ⊆ P(Vn) \ {∅}.
Let (Vn, E, C, An, B) be a soft n-SuperHyperGraph (i.e., An : C → P(Vn) and B : C → P(E) with B(c) ⊆ {e ∈ E : e ⊆ An(c)}). Define

A(c) := dcl(An(c)) ⊆ dcl(Vn),   B(c) := B(c) ⊆ E   (c ∈ C).

Then

S = (dcl(Vn), E, C, A, B)

is a hierarchical soft SuperHyperGraph of height n over H^⟨n⟩. Moreover, restricting each slice to the top layer recovers the original soft n-SuperHyperGraph:

An(c) = A(c) ∩ Vn,   B(c) = B(c)   (c ∈ C).
Proof. First, dcl(Vn) ⊆ Un(V0) by construction, and it is downward closed by Definition 4.22.2. Hence H^⟨n⟩ = (dcl(Vn), E) satisfies (H1) and (H3). Since E ⊆ P(Vn) \ {∅} and Vn ⊆ dcl(Vn), we also have E ⊆ P(dcl(Vn)) \ {∅}, so (H2) holds.
Now fix c ∈ C . By definition, A(c) = dcl(An (c)) is downward closed, so (HS1) holds. For (HS2),
take any e ∈ B(c). In the original soft n-SuperHyperGraph, e ⊆ An (c). Since An (c) ⊆ A(c), it
follows that e ⊆ A(c), hence e ∈ {e′ ∈ E : e′ ⊆ A(c)}. Therefore B(c) ⊆ {e ∈ E : e ⊆ A(c)},
establishing (HS2). Thus S is a hierarchical soft SuperHyperGraph.
Finally, since An (c) ⊆ Vn and Vn is precisely the top-level part of dcl(Vn ), downward closure adds
only lower-level constituents; hence A(c) ∩ Vn = An (c). The edge component is unchanged.


# Page. 111

![Page Image](https://bcdn.docswell.com/page/L73WK19Z75.jpg)



# Page. 112

![Page Image](https://bcdn.docswell.com/page/87DK3XG4JG.jpg)

Chapter 5
Soft Decision-Making
In this chapter, we examine several soft decision-making methods.
5.1 Soft decision-making
Soft decision-making ranks finitely many alternatives by weighted counts of satisfied parameters
in a soft set, and selects the argmax scorers. Related frameworks include fuzzy decision-making
[317, 318] and neutrosophic decision-making [319, 320].
Definition 5.1.1 (Soft decision-making model induced by a soft set). Let U be a nonempty
finite set of alternatives (objects) and let E be a nonempty set of parameters (criteria). Fix a
nonempty finite subset A ⊆ E and a soft set (F, A) over U, i.e.,
F : A −→ P(U ).
Let w : A → [0, ∞) be a weight (importance) function.
For each alternative u ∈ U define the (crisp) satisfaction indicator under a ∈ A by

χF(u, a) := 1 if u ∈ F(a), and χF(u, a) := 0 if u ∉ F(a).

The soft decision score of u (with respect to (F, A) and w) is the real number

S(F,A),w(u) := Σ_{a∈A} w(a) χF(u, a).
The induced soft decision correspondence is

DM(F, A; w) := arg max_{u∈U} S(F,A),w(u) ⊆ U,

i.e., the set of all alternatives having maximum score. Equivalently, one may define the induced (pre)order ≽(F,A),w on U by

u ≽(F,A),w v  ⇐⇒  S(F,A),w(u) ≥ S(F,A),w(v),

and then DM(F, A; w) is the set of ≽(F,A),w-maximal elements.
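The score-and-argmax procedure of Definition 5.1.1 is easily made executable; the alternatives, parameters, and weights below are illustrative:

```python
# Soft decision-making: weighted count of satisfied parameters, then argmax.
U = ["u1", "u2", "u3"]
F = {"cheap": {"u1", "u2"}, "safe": {"u2", "u3"}, "modern": {"u1"}}
w = {"cheap": 2.0, "safe": 1.5, "modern": 0.5}

def score(u):
    # S_{(F,A),w}(u) = sum over a in A of w(a) * chi_F(u, a)
    return sum(w[a] for a in F if u in F[a])

scores = {u: score(u) for u in U}
best = max(scores.values())
DM = {u for u in U if scores[u] == best}   # the argmax correspondence
print(scores)  # {'u1': 2.5, 'u2': 3.5, 'u3': 1.5}
print(DM)      # {'u2'}
```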


# Page. 113

![Page Image](https://bcdn.docswell.com/page/VJPK4P3VE8.jpg)

Chapter 5. Soft Decision-Making
Theorem 5.1.2 (Soft-set structure and well-definedness). In the setting of Definition 5.1.1:
(i) Soft-set structure. The data used by the decision model is exactly a soft set (F, A) over U
(together with a weight map w).
(ii) Well-defined score. The mapping S(F,A),w : U → R is well-defined (unique value for each
u ∈ U ).
(iii) Existence of optimal decisions. The decision correspondence DM(F, A; w) is well-defined
and nonempty.
(iv) Invariance under extension to the full parameter set. Define the extension F̃ : E → P(U) by

F̃(e) := F(e) if e ∈ A, and F̃(e) := ∅ if e ∈ E \ A.

Then the decision outcome computed from (F, A) equals the outcome computed from F̃ restricted to A; in particular, adding “unused” parameters with empty images does not change DM(F, A; w).
Proof. (i) By assumption, A ⊆ E is nonempty and F : A → P(U ); hence (F, A) is, by definition,
a soft set over U . The additional map w : A → [0, ∞) only provides importance weights and
does not alter the underlying soft-set structure.
(ii) Fix u ∈ U. For each a ∈ A, the statement u ∈ F(a) is unambiguous, so χF(u, a) ∈ {0, 1} is uniquely determined. Since A is finite and w(a)χF(u, a) ∈ [0, ∞) for each a, the finite sum S(F,A),w(u) = Σ_{a∈A} w(a)χF(u, a) is a uniquely determined real number. Thus S(F,A),w : U → R is well-defined.
(iii) Because U is finite and S(F,A),w(u) ∈ R is well-defined for every u ∈ U by (ii), the maximum value max_{u∈U} S(F,A),w(u) exists. Therefore the argmax set DM(F, A; w) = {u ∈ U : S(F,A),w(u) = max_{v∈U} S(F,A),w(v)} is a well-defined subset of U and is nonempty.
(iv) For a ∈ A we have F̃(a) = F(a) by definition of F̃, hence χF̃(u, a) = χF(u, a) for all u ∈ U and a ∈ A. Therefore, for every u ∈ U,

S(F̃|A,A),w(u) = Σ_{a∈A} w(a)χF̃(u, a) = Σ_{a∈A} w(a)χF(u, a) = S(F,A),w(u).

Thus the score functions coincide on U, and consequently their argmax sets coincide: DM(F̃|A, A; w) = DM(F, A; w).


# Page. 114

![Page Image](https://bcdn.docswell.com/page/2EVVX24REQ.jpg)

5.2 HyperSoft TOPSIS and SuperHyperSoft TOPSIS
TOPSIS ranks alternatives by distance to positive ideal and negative ideal points, using normalized weighted criteria values [29, 321–323]. HyperSoft TOPSIS ranks alternatives using TOPSIS
on multi-attribute value tuples (hypersoft parameters) with weighted distances to ideal solutions. SuperHyperSoft TOPSIS extends HyperSoft TOPSIS by allowing each attribute to be a
value-subset, enabling set-valued criteria tuples.
Definition 5.2.1 (Hypersoft criterion domain). Let U = {u1 , . . . , un } be a finite set of alternatives. Let A1 , . . . , Ak be pairwise-disjoint attribute-value sets (domains), and define the
hypersoft parameter space
C := A1 × A2 × · · · × Ak .
A finite set of criteria-tuples is a subset
Λ = {λ1 , . . . , λm } ⊆ C,
where each λj = (a1j , . . . , akj ) specifies a combined multi-attribute value tuple.
Definition 5.2.2 (Numeric hypersoft evaluation). A numeric hypersoft evaluation is a map
x : U × Λ → R≥0,   (ui, λj) ↦ xij.

Equivalently, x is the hypersoft decision matrix X = (xij) ∈ R≥0^{n×m}.
(Crisp hypersoft set as a special case.) If one is given a crisp hypersoft set G : Λ → P(U ), then
it induces a numeric evaluation via
xij := 1{ui ∈G(λj )} ∈ {0, 1}.
More generally, if evaluations are given in a graded/uncertain hypersoft form (fuzzy, neutrosophic, picture fuzzy, interval-valued, etc.), one first applies a score map Score : (grade object) →
R≥0 and sets xij := Score(grade of ui at λj ).
Definition 5.2.3 (HyperSoft TOPSIS). Assume the hypersoft decision matrix X = (xij) ∈ R≥0^{n×m} is given. Let w = (w1, . . . , wm) be criterion weights with

wj > 0,   Σ_{j=1}^{m} wj = 1.
Let B ⊆ {1, . . . , m} be the index set of benefit criteria and C ⊆ {1, . . . , m} be the index set of
cost criteria, with B ∪ C = {1, . . . , m} and B ∩ C = ∅.
(1) Vector normalization. For each j ∈ {1, . . . , m} define

dj := √( Σ_{i=1}^{n} xij^2 ).


# Page. 115

![Page Image](https://bcdn.docswell.com/page/57GLVR16EL.jpg)

The normalized matrix R = (rij) is

rij := xij/dj if dj > 0, and rij := 0 if dj = 0.
(2) Weighted normalized matrix. Define V = (vij) by vij := wj rij.
(3) Hypersoft positive/negative ideal solutions. For each criterion j, define the ideal values

vj^+ := max_{1≤i≤n} vij if j ∈ B, and vj^+ := min_{1≤i≤n} vij if j ∈ C;
vj^− := min_{1≤i≤n} vij if j ∈ B, and vj^− := max_{1≤i≤n} vij if j ∈ C.

Let A^+ := (v1^+, . . . , vm^+) and A^− := (v1^−, . . . , vm^−).
(4) Separation measures. For each alternative ui, define the (Euclidean) distances to the ideals:

Si^+ := √( Σ_{j=1}^{m} (vij − vj^+)^2 ),   Si^− := √( Σ_{j=1}^{m} (vij − vj^−)^2 ).
(5) Closeness coefficient and ranking. Define the hypersoft TOPSIS closeness coefficient

Ci := Si^− / (Si^+ + Si^−) if Si^+ + Si^− > 0, and Ci := 1/2 if Si^+ + Si^− = 0,

and rank alternatives by decreasing Ci:

up ≽ uq  ⇐⇒  Cp ≥ Cq.

Any total order refining ≽ (e.g., by tie-breaking rules) is called a HyperSoft TOPSIS ranking.
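Steps (1)–(5) can be sketched in pure Python; the decision matrix, weights, and benefit/cost split below are illustrative, not taken from the text:

```python
# A compact sketch of HyperSoft TOPSIS, steps (1)-(5).
from math import sqrt

X = [[7.0, 9.0, 9.0],
     [8.0, 7.0, 8.0],
     [9.0, 6.0, 8.0]]                    # n=3 alternatives, m=3 criteria
w = [0.5, 0.3, 0.2]                      # positive weights summing to 1
benefit = [True, True, False]            # third criterion is a cost
n, m = len(X), len(w)

d = [sqrt(sum(X[i][j] ** 2 for i in range(n))) for j in range(m)]      # (1)
R = [[X[i][j] / d[j] if d[j] > 0 else 0.0 for j in range(m)] for i in range(n)]
V = [[w[j] * R[i][j] for j in range(m)] for i in range(n)]             # (2)
col = lambda j: [V[i][j] for i in range(n)]
v_pos = [max(col(j)) if benefit[j] else min(col(j)) for j in range(m)] # (3)
v_neg = [min(col(j)) if benefit[j] else max(col(j)) for j in range(m)]
S_pos = [sqrt(sum((V[i][j] - v_pos[j]) ** 2 for j in range(m))) for i in range(n)]
S_neg = [sqrt(sum((V[i][j] - v_neg[j]) ** 2 for j in range(m))) for i in range(n)]  # (4)
C = [S_neg[i] / (S_pos[i] + S_neg[i]) if S_pos[i] + S_neg[i] > 0 else 0.5
     for i in range(n)]                                                # (5)
ranking = sorted(range(n), key=lambda i: -C[i])  # decreasing closeness
print([round(c, 3) for c in C], ranking)
```

The SuperHyperSoft variant of Definition 5.2.6 runs the same numeric pipeline; only the indexing of the criteria changes, which is exactly why Theorem 5.2.8 holds.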
Definition 5.2.4 (SuperHyperSoft parameter space and criteria family). Let U = {u1 , . . . , un }
be a finite set of alternatives. Let A1 , . . . , Ak be pairwise-disjoint attribute-value sets (k ≥ 1).
Define the superhypersoft parameter space
C SH := P(A1 ) × P(A2 ) × · · · × P(Ak ),
whose elements γ = (α1 , . . . , αk ) are tuples of value-subsets αt ⊆ At . A finite set of super-criteria
is a subset
Γ = {γ1 , . . . , γm } ⊆ C SH .


# Page. 116

![Page Image](https://bcdn.docswell.com/page/4EQY6VD2JP.jpg)

Definition 5.2.5 (Numeric superhypersoft evaluation and decision matrix). A numeric superhypersoft evaluation (on Γ) is a map

x^SH : U × Γ → R≥0,   (ui, γj) ↦ xij^SH.

Equivalently, x^SH is represented by the superhypersoft decision matrix X^SH = (xij^SH) ∈ R≥0^{n×m}.
(Crisp/graded inputs.) If the raw information is given as a (crisp) SuperHyperSoft set F : Γ → P(U), one may take xij^SH := 1{ui ∈ F(γj)}. If the raw information is graded (fuzzy, neutrosophic, interval-valued, etc.), assume a fixed score map Score to R≥0 and set xij^SH := Score(grade of ui under γj).
Definition 5.2.6 (SuperHyperSoft TOPSIS). Assume the superhypersoft decision matrix X^SH = (xij^SH) ∈ R≥0^{n×m} is given. Let w = (w1, . . . , wm) be weights with wj > 0 and Σ_{j=1}^{m} wj = 1. Let B (benefit) and C (cost) be a partition of {1, . . . , m}.
(1) Vector normalization. For each j let

dj^SH := √( Σ_{i=1}^{n} (xij^SH)^2 ),   rij^SH := xij^SH/dj^SH if dj^SH > 0, and rij^SH := 0 if dj^SH = 0.

Let R^SH = (rij^SH).
(2) Weighted normalized matrix. Define V^SH = (vij^SH) by vij^SH := wj rij^SH.
(3) Ideal solutions. For each j define

vj^{SH,+} := max_{1≤i≤n} vij^SH if j ∈ B, and vj^{SH,+} := min_{1≤i≤n} vij^SH if j ∈ C;
vj^{SH,−} := min_{1≤i≤n} vij^SH if j ∈ B, and vj^{SH,−} := max_{1≤i≤n} vij^SH if j ∈ C.

Let A^{SH,+} = (v1^{SH,+}, . . . , vm^{SH,+}) and A^{SH,−} = (v1^{SH,−}, . . . , vm^{SH,−}).
(4) Separation measures. For each alternative ui set

Si^{SH,+} := √( Σ_{j=1}^{m} (vij^SH − vj^{SH,+})^2 ),   Si^{SH,−} := √( Σ_{j=1}^{m} (vij^SH − vj^{SH,−})^2 ).

(5) Closeness coefficient and ranking. Define

Ci^SH := Si^{SH,−} / (Si^{SH,+} + Si^{SH,−}) if Si^{SH,+} + Si^{SH,−} > 0, and Ci^SH := 1/2 if Si^{SH,+} + Si^{SH,−} = 0,

and rank alternatives by decreasing Ci^SH.


# Page. 117

![Page Image](https://bcdn.docswell.com/page/KJ4W4MZP71.jpg)

Definition 5.2.7 (Singleton embedding of hypersoft parameters). Let C^H := A1 × · · · × Ak be the hypersoft parameter space. Define the map

ι : C^H → C^SH,   ι(a1, . . . , ak) := ({a1}, . . . , {ak}).

For Λ = {λ1, . . . , λm} ⊆ C^H, set

Γ := ι(Λ) = {ι(λ1), . . . , ι(λm)} ⊆ C^SH.
Theorem 5.2.8 (SuperHyperSoft TOPSIS strictly generalizes HyperSoft TOPSIS). Let U be a set of alternatives and let Λ = {λ1, . . . , λm} ⊆ A1 × · · · × Ak be a hypersoft criterion family. Let x^H : U × Λ → R≥0 be a numeric hypersoft evaluation and define Γ = ι(Λ) as in Definition 5.2.7. Define a superhypersoft evaluation x^SH : U × Γ → R≥0 by

x^SH(u, ι(λ)) := x^H(u, λ)   (u ∈ U, λ ∈ Λ).

Fix the same weight vector w and the same benefit/cost partition (B, C) for both methods. Then the SuperHyperSoft TOPSIS closeness coefficients equal the HyperSoft TOPSIS closeness coefficients:

Ci^SH = Ci^H   for all i ∈ {1, . . . , n},

and hence both methods produce the same ranking of alternatives.
and hence both methods produce the same ranking of alternatives.
Proof. Index Λ = {λ1, . . . , λm} and Γ = {γ1, . . . , γm} with γj = ι(λj). By definition of x^SH, the two decision matrices coincide entrywise:

xij^SH = x^SH(ui, γj) = x^SH(ui, ι(λj)) = x^H(ui, λj) = xij^H.

Therefore, for each column j the normalization denominators are equal:

dj^SH = √( Σ_{i=1}^{n} (xij^SH)^2 ) = √( Σ_{i=1}^{n} (xij^H)^2 ) = dj^H.

Hence rij^SH = rij^H for all i, j, and thus also

vij^SH = wj rij^SH = wj rij^H = vij^H.

Since the weighted normalized matrices coincide, their componentwise maxima/minima over i coincide as well; thus A^{SH,+} = A^{H,+} and A^{SH,−} = A^{H,−}. Consequently, the separation measures are identical for each i:

Si^{SH,±} = √( Σ_{j=1}^{m} (vij^SH − vj^{SH,±})^2 ) = √( Σ_{j=1}^{m} (vij^H − vj^{H,±})^2 ) = Si^{H,±}.

Plugging into the closeness coefficient formula yields Ci^SH = Ci^H for all i. Therefore the induced rankings coincide.
5.3 Soft, HyperSoft, and SuperHyperSoft AHP
AHP is an MCDM method using pairwise comparisons to derive ratio-scale weights, aggregate
priorities hierarchically, and rank alternatives [324–327]. Soft AHP applies pairwise comparison
matrices to soft parameters, derives criteria weights and alternative priorities, then aggregates
them into rankings. HyperSoft AHP extends Soft AHP by using multiple attribute value tuples
as criteria, enabling richer parameterized pairwise evaluations for alternatives. SuperHyperSoft
AHP generalizes HyperSoft AHP by allowing set-valued attribute choices per criterion tuple, improving decision flexibility in uncertain contexts.


# Page. 118

![Page Image](https://bcdn.docswell.com/page/LE1Y48RX7G.jpg)

Definition 5.3.1 (Positive reciprocal (pairwise-comparison) matrix). Let n ∈ N. A matrix A = (aij) ∈ R^{n×n} is called a positive reciprocal matrix if

aij > 0,   aii = 1,   aij = 1/aji   (1 ≤ i, j ≤ n).

We denote by Rn the set of all positive reciprocal matrices of size n.
Definition 5.3.2 (Priority vector operator). Let A ∈ Rn. Since A is a positive matrix, by the Perron–Frobenius theorem A has a largest eigenvalue λmax(A) > 0 with a strictly positive eigenvector. Define the priority vector π(A) ∈ R>0^n as any Perron eigenvector normalized to sum 1:

A π(A) = λmax(A) π(A),   Σ_{i=1}^{n} πi(A) = 1.

(When the Perron eigenvalue is simple, π(A) is unique.)
Remark 5.3.3 (Consistency). A matrix $A \in \mathcal{R}_n$ is (multiplicatively) consistent iff $a_{ij} a_{jk} = a_{ik}$ for all $i, j, k$. In that case there exists $w \in \mathbb{R}^n_{>0}$ such that $a_{ij} = w_i / w_j$, and then $\pi(A)$ recovers $w$ up to normalization.
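Definition 5.3.2 and Remark 5.3.3 can be illustrated numerically. The sketch below (pure Python; the helper name `priority_vector` is ours, not from the text) computes $\pi(A)$ by power iteration, which converges for positive matrices by Perron–Frobenius, and checks that a consistent matrix built from $w$ recovers $w$:

```python
def priority_vector(A, iters=200):
    """Perron eigenvector of a positive reciprocal matrix, normalized to
    sum 1, via power iteration (converges since A is positive)."""
    n = len(A)
    pi = [1.0 / n] * n
    for _ in range(iters):
        y = [sum(A[i][j] * pi[j] for j in range(n)) for i in range(n)]
        s = sum(y)
        pi = [yi / s for yi in y]
    return pi

# A consistent matrix built from w = (0.5, 0.3, 0.2): a_ij = w_i / w_j.
w = [0.5, 0.3, 0.2]
A = [[wi / wj for wj in w] for wi in w]
pi = priority_vector(A)
print(pi)  # recovers w (here w already sums to 1)
```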
Definition 5.3.4 (Soft AHP decision instance). Let $U = \{u_1, \ldots, u_n\}$ be a finite set of alternatives and let $E$ be a set of parameters (criteria). Fix a soft parameter set $S = \{e_1, \ldots, e_m\} \subseteq E$. A Soft AHP decision instance is a tuple
\[ \mathrm{SAHP} = (U, E, S, A^{(0)}, \{A^{(e)}\}_{e \in S}), \]
where
(i) $A^{(0)} \in \mathcal{R}_m$ is the criteria pairwise-comparison matrix indexed by $S$;
(ii) for each $e \in S$, $A^{(e)} \in \mathcal{R}_n$ is the alternative pairwise-comparison matrix under criterion $e$.
The criteria weight vector is $w = \pi(A^{(0)}) \in \mathbb{R}^m_{>0}$, and the local alternative weight under $e_j$ is $p^{(j)} = \pi(A^{(e_j)}) \in \mathbb{R}^n_{>0}$.
The global priority (overall score) of the alternatives is the vector
\[ P := \sum_{j=1}^{m} w_j\, p^{(j)} \in \mathbb{R}^n_{>0}, \qquad \text{so } P_i = \sum_{j=1}^{m} w_j\, p_i^{(j)}. \]
A Soft AHP ranking is any ordering of $U$ that is nonincreasing in $P_i$.
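The pipeline of Definition 5.3.4 can be sketched end to end. All pairwise-comparison matrices below are hypothetical, and the helper name `priority_vector` is ours:

```python
def priority_vector(A, iters=200):
    """Normalized Perron eigenvector of a positive reciprocal matrix
    (power iteration)."""
    n = len(A)
    pi = [1.0 / n] * n
    for _ in range(iters):
        y = [sum(A[i][j] * pi[j] for j in range(n)) for i in range(n)]
        s = sum(y)
        pi = [yi / s for yi in y]
    return pi

# Criteria matrix A^(0) over S = {e1, e2}: e1 judged 3x as important as e2.
A0 = [[1.0, 3.0],
      [1 / 3, 1.0]]
# Alternative matrices A^(e) for three alternatives (both consistent here).
Ae = {
    "e1": [[1.0, 2.0, 4.0], [0.5, 1.0, 2.0], [0.25, 0.5, 1.0]],
    "e2": [[1.0, 1.0, 1.0], [1.0, 1.0, 1.0], [1.0, 1.0, 1.0]],
}
S = ["e1", "e2"]
w = priority_vector(A0)                      # criteria weights
p = {e: priority_vector(Ae[e]) for e in S}   # local alternative weights
# Global priority P_i = sum_j w_j * p_i^(j).
P = [sum(w[j] * p[e][i] for j, e in enumerate(S)) for i in range(3)]
ranking = sorted(range(3), key=lambda i: -P[i])
print(P, ranking)
```

Since each $p^{(j)}$ and $w$ sum to 1, the global priorities $P_i$ also sum to 1.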


Definition 5.3.5 (HyperSoft AHP decision instance). Let $U = \{u_1, \ldots, u_n\}$ be alternatives. Let $A_1, \ldots, A_k$ be pairwise-disjoint attribute-value sets and define the hypersoft parameter space $C^{H} = A_1 \times \cdots \times A_k$. Fix a finite criteria-tuples family
\[ \Lambda = \{\lambda_1, \ldots, \lambda_m\} \subseteq C^{H}. \]
A HyperSoft AHP decision instance is a tuple
\[ \mathrm{HAHP} = (U, A_1, \ldots, A_k, \Lambda, A^{(0)}, \{A^{(\lambda)}\}_{\lambda \in \Lambda}), \]
where $A^{(0)} \in \mathcal{R}_m$ compares the criteria-tuples in $\Lambda$, and for each $\lambda \in \Lambda$, $A^{(\lambda)} \in \mathcal{R}_n$ compares alternatives under criterion-tuple $\lambda$.
Define $w = \pi(A^{(0)}) \in \mathbb{R}^m_{>0}$ and $p^{(j)} = \pi(A^{(\lambda_j)}) \in \mathbb{R}^n_{>0}$. The global priority is
\[ P := \sum_{j=1}^{m} w_j\, p^{(j)} \in \mathbb{R}^n_{>0}, \]
and alternatives are ranked by nonincreasing $P_i$.
Theorem 5.3.6 (HyperSoft AHP generalizes Soft AHP). Every Soft AHP decision instance
can be realized as a HyperSoft AHP decision instance, and under this realization both methods
produce identical global priorities and rankings.
Proof. Let $\mathrm{SAHP} = (U, E, S, A^{(0)}, \{A^{(e)}\}_{e \in S})$ with $S = \{e_1, \ldots, e_m\}$. Set $k = 1$ and let $A_1 := E$. Then $C^{H} = A_1 = E$. Define $\Lambda := S \subseteq E$ and identify $\lambda_j \equiv e_j$.
Now define a HyperSoft AHP instance by taking the same criteria matrix $A^{(0)} \in \mathcal{R}_m$ and, for each $\lambda = e \in \Lambda$, setting $A^{(\lambda)} := A^{(e)}$. Then by construction the criteria weight vector $w = \pi(A^{(0)})$ is the same in both models, and each local vector satisfies
\[ \pi(A^{(\lambda_j)}) = \pi(A^{(e_j)}). \]
Hence the global priority vectors coincide:
\[ P^{H} = \sum_{j=1}^{m} w_j\, \pi(A^{(\lambda_j)}) = \sum_{j=1}^{m} w_j\, \pi(A^{(e_j)}) = P^{S}. \]
Therefore the induced rankings are identical.
Definition 5.3.7 (SuperHyperSoft AHP decision instance). Let $U = \{u_1, \ldots, u_n\}$ be alternatives. Let $A_1, \ldots, A_k$ be pairwise-disjoint attribute-value sets and define the superhypersoft parameter space
\[ C^{SH} = \mathcal{P}(A_1) \times \mathcal{P}(A_2) \times \cdots \times \mathcal{P}(A_k). \]
Fix a finite super-criteria family
\[ \Gamma = \{\gamma_1, \ldots, \gamma_m\} \subseteq C^{SH}, \qquad \gamma_j = (\alpha_{1j}, \ldots, \alpha_{kj}), \quad \alpha_{tj} \subseteq A_t. \]


A SuperHyperSoft AHP decision instance is
\[ \mathrm{SHAHP} = (U, A_1, \ldots, A_k, \Gamma, A^{(0)}, \{A^{(\gamma)}\}_{\gamma \in \Gamma}), \]
where $A^{(0)} \in \mathcal{R}_m$ compares the elements of $\Gamma$, and for each $\gamma \in \Gamma$, $A^{(\gamma)} \in \mathcal{R}_n$ compares alternatives under super-criterion $\gamma$.
Let $w = \pi(A^{(0)}) \in \mathbb{R}^m_{>0}$ and $p^{(j)} = \pi(A^{(\gamma_j)}) \in \mathbb{R}^n_{>0}$. The global priority is
\[ P := \sum_{j=1}^{m} w_j\, p^{(j)} \in \mathbb{R}^n_{>0}, \]
and alternatives are ranked by nonincreasing $P_i$.
Definition 5.3.8 (Singleton embedding). Let $C^{H} = A_1 \times \cdots \times A_k$ and $C^{SH} = \mathcal{P}(A_1) \times \cdots \times \mathcal{P}(A_k)$. Define
\[ \iota : C^{H} \to C^{SH}, \qquad \iota(a_1, \ldots, a_k) := (\{a_1\}, \ldots, \{a_k\}). \]
For $\Lambda = \{\lambda_1, \ldots, \lambda_m\} \subseteq C^{H}$, set $\Gamma := \iota(\Lambda)$ and index $\Gamma = \{\gamma_1, \ldots, \gamma_m\}$ by $\gamma_j = \iota(\lambda_j)$.
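The embedding $\iota$ is straightforward to realize in code; a small sketch (the attribute values below are hypothetical):

```python
def iota(lam):
    """Singleton embedding: (a1, ..., ak) in C^H  ->  ({a1}, ..., {ak}) in C^SH."""
    return tuple(frozenset([a]) for a in lam)

# Hypothetical attribute-value sets A1 (colors) and A2 (sizes).
Lam = [("red", "small"), ("blue", "large")]   # Λ ⊆ A1 × A2
Gamma = [iota(lam) for lam in Lam]            # Γ = ι(Λ), with γ_j = ι(λ_j)
print(Gamma)
```

Using `frozenset` keeps each coordinate hashable, so the set-valued tuples in $\Gamma$ can themselves index dictionaries of pairwise-comparison matrices.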
Theorem 5.3.9 (SuperHyperSoft AHP generalizes HyperSoft AHP). Every HyperSoft AHP
decision instance can be realized as a SuperHyperSoft AHP decision instance via the singleton
embedding ι. Under this realization, both methods yield identical global priorities and rankings.
Proof. Let $\mathrm{HAHP} = (U, A_1, \ldots, A_k, \Lambda, A^{(0)}, \{A^{(\lambda)}\}_{\lambda \in \Lambda})$ with $\Lambda = \{\lambda_1, \ldots, \lambda_m\}$. Define $\Gamma = \iota(\Lambda)$ and $\gamma_j = \iota(\lambda_j)$ as in Definition 5.3.8. Construct a SuperHyperSoft AHP instance by taking the same criteria matrix $A^{(0)} \in \mathcal{R}_m$ and defining
\[ A^{(\gamma_j)} := A^{(\lambda_j)} \qquad (j = 1, \ldots, m). \]
Then the criteria weight vector $w = \pi(A^{(0)})$ is the same in both models, and each local alternative priority vector satisfies
\[ \pi(A^{(\gamma_j)}) = \pi(A^{(\lambda_j)}). \]
Hence the global priority vectors coincide:
\[ P^{SH} = \sum_{j=1}^{m} w_j\, \pi(A^{(\gamma_j)}) = \sum_{j=1}^{m} w_j\, \pi(A^{(\lambda_j)}) = P^{H}. \]
Therefore the induced rankings are identical.
5.4 Soft, HyperSoft, and SuperHyperSoft VIKOR
VIKOR is an MCDM method that ranks alternatives by compromise between group utility and individual regret, controlled by a parameter $v$ [328–331]. Soft VIKOR ranks alternatives using soft parameters: it computes ideal best/worst values, the group utility $S$, the individual regret $R$, and the compromise index $Q$. HyperSoft VIKOR replaces single parameters with attribute-value tuples and then applies the VIKOR normalization, the $S$ and $R$ aggregation, and the $Q$-ranking procedure. SuperHyperSoft VIKOR allows set-valued attribute choices per criterion, embedding HyperSoft VIKOR via singleton sets and preserving the $S$, $R$, $Q$ outputs exactly.
Notation 5.4.1 (Alternatives, criteria-orientation, and weights). Let $U = \{u_1, \ldots, u_n\}$ be a finite set of alternatives and let $m \in \mathbb{N}$. A criterion (parameter) will always be equipped with an orientation
\[ \tau \in \{\mathrm{ben}, \mathrm{cost}\}, \]
meaning that larger values are preferred for $\mathrm{ben}$ and smaller values are preferred for $\mathrm{cost}$.
A weight vector on a finite criterion-family $C = \{c_1, \ldots, c_m\}$ is $w = (w_1, \ldots, w_m) \in [0,1]^m$ with $\sum_{j=1}^{m} w_j = 1$.
Definition 5.4.2 (Ideal best/worst values). Let $C = \{c_1, \ldots, c_m\}$ be a criterion-family with orientations $\tau_j \in \{\mathrm{ben}, \mathrm{cost}\}$. Let $f : U \times C \to \mathbb{R}$ be an evaluation function and write $f_{ij} := f(u_i, c_j)$.
For each $j \in \{1, \ldots, m\}$ define the ideal best value $f_j^{*}$ and ideal worst value $f_j^{-}$ by
\[ (f_j^{*}, f_j^{-}) := \begin{cases} \bigl(\max_{1 \le i \le n} f_{ij},\; \min_{1 \le i \le n} f_{ij}\bigr), & \tau_j = \mathrm{ben}, \\ \bigl(\min_{1 \le i \le n} f_{ij},\; \max_{1 \le i \le n} f_{ij}\bigr), & \tau_j = \mathrm{cost}. \end{cases} \]
Definition 5.4.3 (Normalized loss (distance from the ideal)). Under the hypotheses of Definition 5.4.2, define the normalized loss $d_{ij} \in [0,1]$ by
\[ d_{ij} := \begin{cases} 0, & f_j^{*} = f_j^{-}, \\[4pt] \dfrac{f_j^{*} - f_{ij}}{f_j^{*} - f_j^{-}}, & \tau_j = \mathrm{ben} \text{ and } f_j^{*} \neq f_j^{-}, \\[4pt] \dfrac{f_{ij} - f_j^{*}}{f_j^{-} - f_j^{*}}, & \tau_j = \mathrm{cost} \text{ and } f_j^{*} \neq f_j^{-}. \end{cases} \]
Equivalently, $d_{ij} = 0$ iff $u_i$ attains the ideal best on criterion $c_j$ (or the criterion is constant).
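Definitions 5.4.2 and 5.4.3 can be sketched together; the helper names and the 3 × 2 decision matrix below are illustrative only:

```python
def ideal_values(col, tau):
    """(f_j^*, f_j^-) for one criterion column, per orientation tau."""
    return (max(col), min(col)) if tau == "ben" else (min(col), max(col))

def normalized_loss(fij, f_star, f_minus, tau):
    """Normalized loss d_ij in [0, 1]; 0 for a constant criterion."""
    if f_star == f_minus:
        return 0.0
    if tau == "ben":
        return (f_star - fij) / (f_star - f_minus)
    return (fij - f_star) / (f_minus - f_star)

# 3 alternatives x 2 criteria: c1 is a benefit, c2 is a cost.
f = [[8.0, 200.0], [6.0, 150.0], [9.0, 300.0]]
tau = ["ben", "cost"]
d = []
for i in range(3):
    row = []
    for j in range(2):
        col = [f[r][j] for r in range(3)]
        fs, fm = ideal_values(col, tau[j])
        row.append(normalized_loss(f[i][j], fs, fm, tau[j]))
    d.append(row)
print(d)
```

Note how alternative 3 gets loss 0 on the benefit criterion (it attains $f_1^{*} = 9$) and loss 1 on the cost criterion (it attains $f_2^{-} = 300$).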
Definition 5.4.4 (Group utility and individual regret). Let $d_{ij}$ be as in Definition 5.4.3 and let $w \in [0,1]^m$ with $\sum_j w_j = 1$. Define for each alternative $u_i$:
\[ S_i := \sum_{j=1}^{m} w_j d_{ij} \qquad \text{and} \qquad R_i := \max_{1 \le j \le m} (w_j d_{ij}). \]
Set
\[ S^{*} := \min_{1 \le i \le n} S_i, \quad S^{-} := \max_{1 \le i \le n} S_i, \quad R^{*} := \min_{1 \le i \le n} R_i, \quad R^{-} := \max_{1 \le i \le n} R_i. \]
Definition 5.4.5 (Compromise index $Q$). Let $v \in [0,1]$ be fixed (often $v = 1/2$). With $S_i, R_i, S^{*}, S^{-}, R^{*}, R^{-}$ as in Definition 5.4.4, define
\[ Q_i := v\, \frac{S_i - S^{*}}{S^{-} - S^{*}} + (1 - v)\, \frac{R_i - R^{*}}{R^{-} - R^{*}}, \]
using the convention that $\frac{0}{0} := 0$ (so if $S^{-} = S^{*}$ then the first fraction is set to $0$, and similarly for $R^{-} = R^{*}$).
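Definitions 5.4.4 and 5.4.5 then reduce to a few lines; the losses `d` and weights `w` below are hypothetical:

```python
# Normalized losses d[i][j] (3 alternatives x 2 criteria) and weights w.
d = [[1/3, 1/3], [1.0, 0.0], [0.0, 1.0]]
w = [0.6, 0.4]
v = 0.5

S = [sum(wj * dij for wj, dij in zip(w, row)) for row in d]   # group utility
R = [max(wj * dij for wj, dij in zip(w, row)) for row in d]   # individual regret

def ratio(x, lo, hi):
    return 0.0 if hi == lo else (x - lo) / (hi - lo)          # 0/0 := 0 convention

Q = [v * ratio(si, min(S), max(S)) + (1 - v) * ratio(ri, min(R), max(R))
     for si, ri in zip(S, R)]
ranking = sorted(range(len(Q)), key=lambda i: Q[i])           # nondecreasing Q
print(S, R, Q, ranking)
```

The best-by-$S$ and best-by-$R$ alternative here is also the $Q$-minimizer, but in general the three orderings can disagree, which is exactly what the compromise index mediates.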


Definition 5.4.6 (Soft VIKOR decision instance and solution). Let $E$ be a parameter set and let $S = \{e_1, \ldots, e_m\} \subseteq E$ be a finite soft-parameter set. A Soft VIKOR decision instance is a tuple
\[ \mathrm{SVIKOR} = (U, E, S, \tau, w, f, v), \]
where
(i) $\tau : S \to \{\mathrm{ben}, \mathrm{cost}\}$ assigns an orientation $\tau_j := \tau(e_j)$ to each parameter;
(ii) $w = (w_1, \ldots, w_m) \in [0,1]^m$ with $\sum_{j=1}^{m} w_j = 1$ is the criterion-weight vector;
(iii) $f : U \times S \to \mathbb{R}$ is an evaluation function (decision matrix) with $f_{ij} := f(u_i, e_j)$;
(iv) $v \in [0,1]$ is the compromise coefficient.
Compute $f_j^{*}, f_j^{-}$ by Definition 5.4.2, then $d_{ij}$ by Definition 5.4.3, then $S_i, R_i$ by Definition 5.4.4, and finally $Q_i$ by Definition 5.4.5. A Soft VIKOR ranking is any ordering of $U$ that is nondecreasing in $Q_i$.
(Optionally) Let $u^{(1)}$ and $u^{(2)}$ be the first and second alternatives under the $Q$-ordering. Define $DQ := \frac{1}{n-1}$. If (a) $Q(u^{(2)}) - Q(u^{(1)}) \ge DQ$ (acceptable advantage) and (b) $u^{(1)}$ is also best by $S$ or by $R$ (acceptable stability), then $u^{(1)}$ is called the (unique) compromise solution. Otherwise one may output a compromise set consisting of the top few $Q$-alternatives.
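The optional acceptance tests can be sketched as follows. The $Q$, $S$, $R$ values are invented for illustration, and the fallback rule used for the compromise set (all alternatives within $DQ$ of the best) is the commonly cited VIKOR convention:

```python
# Precomputed Q, S, R for n = 4 alternatives (hypothetical values).
Q = {"u1": 0.00, "u2": 0.40, "u3": 0.55, "u4": 1.00}
S = {"u1": 0.20, "u2": 0.35, "u3": 0.50, "u4": 0.90}
R = {"u1": 0.10, "u2": 0.30, "u3": 0.25, "u4": 0.60}

order = sorted(Q, key=Q.get)        # nondecreasing Q
u1, u2 = order[0], order[1]
n = len(Q)
DQ = 1.0 / (n - 1)

adv = Q[u2] - Q[u1] >= DQ                                   # (a) acceptable advantage
stab = u1 == min(S, key=S.get) or u1 == min(R, key=R.get)   # (b) acceptable stability

if adv and stab:
    result = [u1]                   # unique compromise solution
else:
    # fallback: compromise set of all alternatives within DQ of the best
    result = [u for u in order if Q[u] - Q[u1] < DQ]
print(result)
```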
Definition 5.4.7 (HyperSoft VIKOR decision instance). Let $A_1, \ldots, A_k$ be (pairwise-disjoint) attribute-value sets and let
\[ C^{H} := A_1 \times \cdots \times A_k \]
be the hypersoft parameter domain. Fix a finite criteria-tuples family $\Lambda = \{\lambda_1, \ldots, \lambda_m\} \subseteq C^{H}$.
A HyperSoft VIKOR decision instance is a tuple
\[ \mathrm{HVIKOR} = (U, A_1, \ldots, A_k, \Lambda, \tau, w, f, v), \]
where $\tau : \Lambda \to \{\mathrm{ben}, \mathrm{cost}\}$, $w \in [0,1]^m$ with $\sum_j w_j = 1$, $f : U \times \Lambda \to \mathbb{R}$, and $v \in [0,1]$.
With indices $f_{ij} := f(u_i, \lambda_j)$ and $\tau_j := \tau(\lambda_j)$, define $f_j^{*}, f_j^{-}, d_{ij}, S_i, R_i$, and $Q_i$ exactly as in Definitions 5.4.2 to 5.4.5. The ranking is nondecreasing in $Q_i$.
Theorem 5.4.8 (HyperSoft VIKOR generalizes Soft VIKOR). Every Soft VIKOR decision instance can be realized as a HyperSoft VIKOR decision instance. Under this realization, all computed quantities ($f_j^{*}, f_j^{-}, d_{ij}, S_i, R_i, Q_i$) coincide, hence the rankings (and compromise solutions/sets) coincide.
Proof. Let $\mathrm{SVIKOR} = (U, E, S, \tau, w, f, v)$ with $S = \{e_1, \ldots, e_m\}$. Set $k = 1$ and define $A_1 := E$, so $C^{H} = A_1 = E$. Let $\Lambda := S \subseteq E$ and identify $\lambda_j \equiv e_j$.
Define the HyperSoft instance by keeping the same $v$ and the same weight vector $w$, setting $\tau(\lambda_j) := \tau(e_j)$, and defining $f(u, \lambda) := f(u, e)$ under the identification $\lambda \equiv e$.
Then for every $i, j$ we have identical entries $f_{ij}$ in both models, hence the best/worst values $f_j^{*}, f_j^{-}$ from Definition 5.4.2 coincide, and therefore the normalized losses $d_{ij}$ from Definition 5.4.3 coincide. With the same $w$, this forces $S_i$ and $R_i$ from Definition 5.4.4 to coincide, and consequently $Q_i$ from Definition 5.4.5 coincides. Thus the induced rankings and any compromise outputs coincide.
Definition 5.4.9 (SuperHyperSoft VIKOR decision instance). Let $A_1, \ldots, A_k$ be (pairwise-disjoint) attribute-value sets and define the superhypersoft domain
\[ C^{SH} := \mathcal{P}(A_1) \times \mathcal{P}(A_2) \times \cdots \times \mathcal{P}(A_k). \]
Fix a finite family $\Gamma = \{\gamma_1, \ldots, \gamma_m\} \subseteq C^{SH}$ of set-valued criteria-tuples.
A SuperHyperSoft VIKOR decision instance is a tuple
\[ \mathrm{SHVIKOR} = (U, A_1, \ldots, A_k, \Gamma, \tau, w, f, v), \]
where $\tau : \Gamma \to \{\mathrm{ben}, \mathrm{cost}\}$, $w \in [0,1]^m$ with $\sum_j w_j = 1$, $f : U \times \Gamma \to \mathbb{R}$, and $v \in [0,1]$.
Define $f_j^{*}, f_j^{-}, d_{ij}, S_i, R_i$, and $Q_i$ exactly as in Definitions 5.4.2 to 5.4.5 with $\gamma_j$ in place of $c_j$. Rank alternatives nondecreasingly by $Q_i$.
Definition 5.4.10 (Singleton embedding). Let $C^{H} := A_1 \times \cdots \times A_k$ and $C^{SH} := \mathcal{P}(A_1) \times \cdots \times \mathcal{P}(A_k)$. Define
\[ \iota : C^{H} \to C^{SH}, \qquad \iota(a_1, \ldots, a_k) := (\{a_1\}, \ldots, \{a_k\}). \]
For $\Lambda = \{\lambda_1, \ldots, \lambda_m\} \subseteq C^{H}$ set $\Gamma := \iota(\Lambda)$, indexed as $\gamma_j := \iota(\lambda_j)$.
Theorem 5.4.11 (SuperHyperSoft VIKOR generalizes HyperSoft VIKOR). Every HyperSoft VIKOR decision instance can be realized as a SuperHyperSoft VIKOR decision instance via the singleton embedding $\iota$. Under this realization, all computed quantities ($f_j^{*}, f_j^{-}, d_{ij}, S_i, R_i, Q_i$) coincide, hence the rankings coincide.
Proof. Let $\mathrm{HVIKOR} = (U, A_1, \ldots, A_k, \Lambda, \tau, w, f, v)$ with $\Lambda = \{\lambda_1, \ldots, \lambda_m\} \subseteq A_1 \times \cdots \times A_k$. Let $\Gamma = \iota(\Lambda)$ and $\gamma_j = \iota(\lambda_j)$ as in Definition 5.4.10.
Define the SuperHyperSoft instance by keeping the same $v$ and $w$, setting
\[ \tau(\gamma_j) := \tau(\lambda_j), \qquad f(u, \gamma_j) := f(u, \lambda_j) \quad (u \in U,\; j = 1, \ldots, m). \]
Then for every $i, j$, the entries $f_{ij}$ coincide under the identification $\gamma_j \leftrightarrow \lambda_j$. Therefore $f_j^{*}, f_j^{-}$ coincide (same extrema over the same numbers), hence the $d_{ij}$ coincide. With identical weights $w$, the aggregates $S_i$ and $R_i$ coincide, and thus $Q_i$ coincides. Consequently the induced rankings coincide.
Chapter 6
Conclusion
In this book, we provided a survey-style overview of soft set theory and its major developments.
We expect that the concepts reviewed here will stimulate further research, especially on algorithm
design and applications in machine learning and related areas.
Disclaimer
Funding
This study did not receive any financial or external support from organizations or individuals.
Acknowledgments
We extend our sincere gratitude to everyone who provided insights, inspiration, and assistance
throughout this research. We particularly thank our readers for their interest and acknowledge
the authors of the cited works for laying the foundation that made our study possible. We
also appreciate the support from individuals and institutions that provided the resources and
infrastructure needed to produce and share this book. Finally, we are grateful to all those who
supported us in various ways during this project.
Data Availability
This research is purely theoretical, involving no data collection or analysis. We encourage future researchers to pursue empirical investigations to further develop and validate the concepts
introduced here.
Ethical Approval
As this research is entirely theoretical in nature and does not involve human participants or
animal subjects, no ethical approval is required.
Use of Generative AI and AI-Assisted Tools
I use generative AI and AI-assisted tools for tasks such as English grammar checking, and I do
not employ them in any way that violates ethical standards.
Conflicts of Interest
The authors confirm that there are no conflicts of interest related to the research or its publication.
Disclaimer
This work presents theoretical concepts that have not yet undergone practical testing or validation. Future researchers are encouraged to apply and assess these ideas in empirical contexts.
While every effort has been made to ensure accuracy and appropriate referencing, unintentional
errors or omissions may still exist. Readers are advised to verify referenced materials on their
own. The views and conclusions expressed here are the authors’ own and do not necessarily
reflect those of their affiliated organizations.
Appendix (List of Tables)
2.1 Concise comparison of Soft sets, HyperSoft sets, and SuperHyperSoft sets.
2.2 Concise comparison between a SuperHyperSoft set and an (m, n)-SuperHyperSoft set on a universe U.
2.3 Soft Set vs. ContraSoft Set (concise comparison).
2.4 Concise comparison of classical soft sets and probabilistic soft sets over a finite universe U.
2.5 Concise comparison between a classical soft set and a D-soft set over a (finite) universe U.
2.6 Concise comparison between Soft Sets and GraphicSoft Sets over a universe U.


Bibliography
[1] Pradip Kumar Maji, Ranjit Biswas, and A Ranjan Roy. Soft set theory. Computers & Mathematics with Applications, 45(4-5):555–562, 2003.
[2] Dmitriy Molodtsov. Soft set theory - first results. Computers & Mathematics with Applications, 37(4-5):19–31, 1999.
[3] Thomas Jech. Set theory: The third millennium edition, revised and expanded. Springer, 2003.
[4] Lotfi A Zadeh. Fuzzy sets. Information and control, 8(3):338–353, 1965.
[5] Krassimir T Atanassov. Circular intuitionistic fuzzy sets. Journal of Intelligent & Fuzzy Systems, 39(5):5981–5986, 2020.
[6] Vicenç Torra. Hesitant fuzzy sets. International journal of intelligent systems, 25(6):529–539, 2010.
[7] Bui Cong Cuong. Picture fuzzy sets. Journal of Computer Science and Cybernetics, 30:409, 2015.
[8] Said Broumi, Mohamed Talea, Assia Bakali, and Florentin Smarandache. Single valued neutrosophic
graphs. Journal of New theory, 10:86–101, 2016.
[9] Haibin Wang, Florentin Smarandache, Yanqing Zhang, and Rajshekhar Sunderraman. Single valued neutrosophic sets. Infinite Study, 2010.
[10] R Radha, A Stanis Arul Mary, and Florentin Smarandache. Quadripartitioned neutrosophic pythagorean
soft set. International Journal of Neutrosophic Science (IJNS) Volume 14, 2021, page 11, 2021.
[11] Rama Mallick and Surapati Pramanik. Pentapartitioned neutrosophic set and its properties, volume 36.
Infinite Study, 2020.
[12] Lin Wei. An integrated decision-making framework for blended teaching quality evaluation in college
english courses based on the double-valued neutrosophic sets. J. Intell. Fuzzy Syst., 45:3259–3266, 2023.
[13] Hu Zhao and Hong-Ying Zhang. On hesitant neutrosophic rough set over two universes and its application.
Artificial Intelligence Review, 53:4387–4406, 2020.
[14] Florentin Smarandache. Plithogenic set, an extension of crisp, fuzzy, intuitionistic fuzzy, and neutrosophic
sets-revisited. Infinite study, 2018.
[15] Florentin Smarandache. Extension of HyperGraph to n-SuperHyperGraph and to Plithogenic n-SuperHyperGraph, and Extension of HyperAlgebra to n-ary (Classical-/Neutro-/Anti-) HyperAlgebra. Infinite Study, 2020.
[16] Feng Feng, Xiaoyan Liu, Violeta Leoreanu-Fotea, and Young Bae Jun. Soft sets and soft rough sets.
Information Sciences, 181(6):1125–1137, 2011.
[17] Lotfi A Zadeh. A note on z-numbers. Information sciences, 181(14):2923–2932, 2011.
[18] Florentin Smarandache. A unifying field in logics: Neutrosophic logic. In Philosophy, pages 1–141. American Research Press, 1999.
[19] Naeem Jan, Tahir Mahmood, Lemnaouar Zedam, and Zeeshan Ali. Multi-valued picture fuzzy soft sets
and their applications in group decision-making problems. Soft Computing, 24:18857 – 18879, 2020.
[20] Eugenio Aguirre and Antonio González. Fuzzy behaviors for mobile robot navigation: design, coordination
and fusion. International Journal of Approximate Reasoning, 25(3):255–289, 2000.
[21] Alberto Fernandez, Francisco Herrera, Oscar Cordon, Maria Jose del Jesus, and Francesco Marcelloni.
Evolutionary fuzzy systems for explainable artificial intelligence: Why, when, what for, and where to?
IEEE Computational intelligence magazine, 14(1):69–81, 2019.
[22] Yasmine M Ibrahim, Reem Essameldin, and Saad M Darwish. An adaptive hate speech detection approach
using neutrosophic neural networks for social media forensics. Computers, Materials & Continua, 79(1),
2024.
[23] OM Khaled, AA Salama, Mostafa Herajy, MM El-Kirany, Huda E Khalid, Ahmed K Essa, and Ramiz
Sabbagh. A novel approach for cyber-attack detection in iot networks with neutrosophic neural networks.
Neutrosophic Sets and Systems, 86(1):48, 2025.
[24] Florentin Smarandache. New types of soft sets “hypersoft set, indetermsoft set, indetermhypersoft set, and
treesoft set”: an improved version. Infinite Study, 2023.
129


# Page. 131

![Page Image](https://bcdn.docswell.com/page/P7XQKXN3EX.jpg)

Bibliography
[25] Muhammad Ihsan, Atiqe Ur Rahman, and Muhammad Haris Saeed. Hypersoft expert set with application
in decision making for recruitment process. In Neutrosophic Sets and Systems, 2021.
[26] Florentin Smarandache. Extension of soft set to hypersoft set, and then to plithogenic hypersoft set.
Neutrosophic sets and systems, 22(1):168–170, 2018.
[27] Oswaldo Edison García Brito, Andrea Sofía Ribadeneira Vacacela, Carmen Hortensia Sánchez Burneo, and
Mónica Cecilia Jimbo Galarza. English for specific purposes in the medical sciences to strengthen the professional profile of the higher education medicine student: a knowledge representation using superhypersoft
sets. Neutrosophic Sets and Systems, 74(1):10, 2024.
[28] Mona Mohamed, Alaa Elmor, Florentin Smarandache, and Ahmed A Metwaly. An efficient superhypersoft
framework for evaluating llms-based secure blockchain platforms. Neutrosophic Sets and Systems, 72:1–21,
2024.
[29] T Kiruthika, M Karpagadevi, S Krishnaprakash, and G Deepa. Superhypersoft sets using python and
its applications in neutrosophic superhypersoft sets under topsis method. Neutrosophic Sets and Systems,
91:586–616, 2025.
[30] Florentin Smarandache. Foundation of the superhypersoft set and the fuzzy extension superhypersoft set:
A new vision. Neutrosophic Systems with Applications, 11:48–51, 2023.
[31] Ali Alqazzaz and Karam M Sallam. Evaluation of sustainable waste valorization using treesoft set with
neutrosophic sets. Neutrosophic Sets and Systems, 65(1):9, 2024.
[32] Edwin Collazos Paucar, Jeri G Ramón Ruffner de Vega, Efrén S Michue Salguedo, Agustina C Torres-Rodríguez, and Patricio A Santiago-Saturnino. Analysis using treesoft set of the strategic development
plan for extreme poverty municipalities. Neutrosophic Sets and Systems, 69(1):3, 2024.
[33] G Dhanalakshmi, S Sandhiya, Florentin Smarandache, et al. Selection of the best process for desalination
under a treesoft set environment using the multi-criteria decision-making method. International Journal
of Neutrosophic Science, 23(3):140–40, 2024.
[34] Mona Gharib, Fatima Rajab, and Mona Mohamed. Harnessing tree soft set and soft computing techniques’
capabilities in bioinformatics: Analysis, improvements, and applications. Neutrosophic sets and systems,
61:579–597, 2023.
[35] Takaaki Fujita. Polytree-soft sets and polyforest-soft sets: A directed acyclic framework for soft set
modeling. HyperSoft Set Methods in Engineering, 4:11–23, 2025.
[36] Takaaki Fujita, Arif Mehmood, Ajoy Kanti Das, Suman Das, Volkan Duran, Arkan A Ghaib, and Talal Al-Hawary. Multitree-soft, pseudotree-soft set, hypertree-soft, and tree-to-tree-soft set. Neutrosophic Computing and Machine Learning, ISSN 2574-1101, 42:65–87, 2026.
[37] Florentin Smarandache. New types of soft sets: Hypersoft set, indetermsoft set, indetermhypersoft set,
and treesoft set. International Journal of Neutrosophic Science, 2023.
[38] Hairong Luo. Forestsoft set approach for estimating innovation and entrepreneurship education in universities through a hierarchical and uncertainty-aware analytical framework. Neutrosophic Sets and Systems,
86(1):21, 2025.
[39] Takaaki Fujita, Ajoy Kanti Das, Arif Mehmood, Suman Das, and Volkan Duran. Decision analytics
applications of the relationship between treesoft graphs and forestsoft graphs. Applied Decision Analytics,
2(1):73–92, 2026.
[40] Takaaki Fujita and Florentin Smarandache. Quantum-TreeSoft Set and Quantum-ForestSoft Set. Infinite
Study, 2025.
[41] P Sathya, Nivetha Martin, and Florentine Smarandache. Plithogenic forest hypersoft sets in plithogenic
contradiction based multi-criteria decision making. Neutrosophic Sets and Systems, 73:668–693, 2024.
[42] Viviana del Rocío Marfetan Marfetan, Lesly Gissela Tipanguano Chicaiza, Styven Andrés Pila Chicaiza,
and Estephany Monserrath Ojeda Sanchez. Classification of cases of animal abuse in Ecuador using IndetermSoft and C4.5 algorithms. Neutrosophic Sets and Systems, 92:121–133, 2025.
[43] Erick González Caballero, Ketty Marilú Moscoso-Paucarchuco, Noel Batista Hernandez, Lorenzo Jovanny Cevallos Torres, Maikel Leyva, and Victor Gustavo Gómez Rodríguez. Algorithms of designing
decision trees from indeterm soft sets. In Neutrosophic and Plithogenic Inventory Models for Applied
Mathematics, pages 561–586. IGI Global Scientific Publishing, 2025.
[44] Wei Wei and Pingting Peng. Weighted indetermsoft set for prioritized decision-making with indeterminacy
and its application to green competitiveness evaluation in equipment manufacturing enterprises. Neutrosophic Sets and Systems, 85:1018–1026, 2025.
[45] Hai Yang and Cuijuan Lin. A recursive indetermtree soft set (rit-soft set) for dynamic and uncertain
performance evaluation in college competitive sports. Neutrosophic Sets and Systems, 85:874–886, 2025.
[46] Tao Shen and Chunmei Mao. Sustainability impact of online consumption behavior from the perspective of
digital empowerment: Indetermsoft set with application. Neutrosophic Sets and Systems, 82(1):24, 2025.
[47] Bhargavi Krishnamurthy and Sajjan G Shiva. Indetermsoft-set-based d* extra lite framework for resource
provisioning in cloud computing. Algorithms, 17(11):479, 2024.
130


# Page. 132

![Page Image](https://bcdn.docswell.com/page/37K95WNN7D.jpg)

Bibliography
[48] Florentin Smarandache. Introduction to SuperHyperAlgebra and Neutrosophic SuperHyperAlgebra. Infinite
Study, 2022.
[49] Yan Xu. A neutrosophic α-discounting indetermhypersoft framework for evaluating agricultural product
export trade quality under uncertainty. Neutrosophic Sets and Systems, 87:533–542, 2025.
[50] Lingling Chen. A comprehensive indetermhypersoft set model for evaluating university literature education
effectiveness: Integrating cultural context, argumentation skills, and dynamic progress. Neutrosophic Sets
and Systems, 87:295–309, 2025.
[51] Takaaki Fujita and Florentin Smarandache. An introduction to advanced soft set variants: Superhypersoft
sets, indetermsuperhypersoft sets, indetermtreesoft sets, bihypersoft sets, graphicsoft sets, and beyond.
Neutrosophic Sets and Systems, 82:817–843, 2025.
[52] Takaaki Fujita and Florentin Smarandache. Navigating Bipolar Indeterminacy: Bipolar IndetermSoft Sets
and Bipolar IndetermHyperSoft Sets for Knowledge Representation. Infinite Study, 2026.
[53] Takaaki Fujita, Raed Hatamleh, and Ahmed Salem Heilat. Contrasoft set and contrarough set with using
upside-down logic. Statistics, Optimization & Information Computing, 2025.
[54] Vicenç Torra and Yasuo Narukawa. On hesitant fuzzy sets and decision. In 2009 IEEE international
conference on fuzzy systems, pages 1378–1382. IEEE, 2009.
[55] Yiwei Chen, Qiu Xie, Xiaoyu Ma, and Yuwei Li. Optimizing site selection for construction and demolition waste resource treatment plants using a hesitant neutrosophic set: a case study in xiamen, china.
Engineering Optimization, pages 1–22, 2024.
[56] Juanjuan Chen, Shenggang Li, Shengquan Ma, and Xueping Wang. m-polar fuzzy sets: an extension of
bipolar fuzzy sets. The scientific world journal, 2014(1):416530, 2014.
[57] V Rajam and N Rajesh. Multipolar neutrosophic subalgebras/ideals of up-algebras. International Journal
of Neutrosophic Science (IJNS), 23(4), 2024.
[58] Muhammad Saqlain, Muhammad Riaz, Natasha Kiran, Poom Kumam, and Miin-Shen Yang. Water quality
evaluation using generalized correlation coefficient for m-polar neutrosophic hypersoft sets. Neutrosophic
Sets and Systems, vol. 55/2023: An International Journal in Information Science and Engineering, page 58,
2024.
[59] M Sivakumar, Rabıyathul Basarıya, Abdul Rajak, M Senthil, T Vetriselvi, G Raja, and R Rajavarman.
Transforming arabic text analysis: Integrating applied linguistics with m-polar neutrosophic set mood
change and depression on social media. International Journal of Neutrosophic Science (IJNS), 25(2),
2025.
[60] Hind Y Saleh, Areen A Salih, Baravan A Asaad, and Ramadhan A Mohammed. Binary bipolar soft points
and topology on binary bipolar soft sets with their symmetric properties. Symmetry, 16(1):23, 2023.
[61] Asghar Khan, Muhammad Izhar, and Mohammed M. Khalaf. Generalised multi-fuzzy bipolar soft sets
and its application in decision making. J. Intell. Fuzzy Syst., 37:2713–2725, 2019.
[62] Maha M Saeed, Sagvan Y Musa, Baravan A Asaad, and Zanyar A Ameen. Pythagorean fuzzy n-bipolar
soft sets-based multi-criteria decision-making framework for sustainability evaluation and risk assessment
in manufacturing industries. Scientific Reports, 15(1):29648, 2025.
[63] Sagvan Y. Musa and Baravan A. Asaad. Topological structures via bipolar hypersoft sets. Journal of
Mathematics, 2022.
[64] Sagvan Y Musa and Baravan A Asaad. Mappings on bipolar hypersoft classes. Neutrosophic Sets and
Systems, 53(1):36, 2023.
[65] Sagvan Y Musa and Baravan A Asaad. A progressive approach to multi-criteria group decision-making:
N-bipolar hypersoft topology perspective. Plos one, 19(5):e0304016, 2024.
[66] T. Fujita and A. Mehmood. Extending classical uncertainty models via hyperpolar structures: Fuzzy,
neutrosophic, and soft set perspectives. Galoitica: J. Math. Struct. Appl., 12:24–39, 2025.
[67] Muhammad Saeed. An introduction to dynamic soft sets: A framework for modeling temporal uncertainty.
Available at SSRN 5820784, 2025.
[68] Muhammad Saeed, Fatima Razaq, and Muhammad Hassan. Dynamic soft set topology: A novel topological
framework incorporating evolving parameter structures, 2025.
[69] Muhammad Saeed, Fatima Razaq, Muhammad Hassan, and Dr Atiqe Ur Rahman. Dynamic soft graphs:
A unified framework for modeling time-indexed uncertainty in evolving networks, 2025.
[70] Himanshukumar R Patel and Vipul A Shah. General type-2 fuzzy logic systems using shadowed sets:
a new paradigm towards fault-tolerant control. In 2021 Australian & New Zealand Control Conference
(ANZCC), pages 116–121. IEEE, 2021.
[71] Mohammad Hossein Azadi, Khaled Nawaser, Ali Vafaei-Zadeh, Seyed Najmodin Mousavi, Razieh Bagherzadeh Khodashahri, and Haniruzila Hanifah. Investigating antecedents of customer relationship
management using interval type-2 fuzzy fmea approach. International Journal of Business Innovation and
Research, 34(2):139–165, 2024.
131


# Page. 133

![Page Image](https://bcdn.docswell.com/page/LJ3WK1VZJ5.jpg)

Bibliography
[72] Marwan H Hassan, Saad M Darwish, and Saleh M Elkaffas. Type-2 neutrosophic set and their applications
in medical databases deadlock resolution. Computers, Materials & Continua, 74(2), 2023.
[73] Muslem Al-Saidi, Áron Ballagi, Oday Ali Hassen, and Saad M Saad. Type-2 neutrosophic markov chain
model for subject-independent sign language recognition: A new uncertainty–aware soft sensor paradigm.
Sensors (Basel, Switzerland), 24(23):7828, 2024.
[74] Soumen Kumar Das, F Yu Vincent, Sankar Kumar Roy, and Gerhard Wilhelm Weber. Location–allocation
problem for green efficient two-stage vehicle-based logistics system: A type-2 neutrosophic multi-objective
modeling approach. Expert Systems with Applications, 238:122174, 2024.
[75] Khizar Hayat, Muhammad Irfan Ali, Bing yuan Cao, and Xiaopeng Yang. A new type-2 soft set: Type-2
soft graphs and their applications. Adv. Fuzzy Syst., 2017:6162753:1–6162753:17, 2017.
[76] Khizar Hayat, Bing-Yuan Cao, Muhammad Irfan Ali, Faruk Karaaslan, and Zejian Qin. Characterizations
of certain types of type 2 soft graphs. Discrete Dynamics in Nature and Society, 2018(1):8535703, 2018.
[77] Musavarah Sarwar and Muhammad Akram. Certain hybrid rough models with type-2 soft information.
Journal of Multiple-Valued Logic & Soft Computing, 40, 2023.
[78] Shumaila Manzoor, Saima Mustafa, Kanza Gulzar, Asim Gulzar, Sadia Nishat Kazmi, Syed Muhammad Abrar Akber, Rasool Bukhsh, Sheraz Aslam, and Syed Muhammad Mohsin. Multifuzztops: A fuzzy
multi-criteria decision-making model using type-2 soft sets and topsis. Symmetry, 16(6):655, 2024.
[79] Guzide Senel. Soft topology generated by l-soft sets. Journal of New Theory, 24:88–100, 2018.
[80] Arif Mehmood Khattak, Nazia Hanif, Fawad Nadeem, Muhammad Zamir, Choonkil Park, Giorgio Nordo,
and Shamoona Jabeen. Soft b-separation axioms in neutrosophic soft topological structures. Infinite Study,
2019.
[81] Saleem Abdullah, Imran Khan, and Muhammad Aslam. A new approach to soft set through applications
of cubic set. arXiv preprint arXiv:1210.6517, 2012.
[82] Srinivasan Vijayabalaji and Kaliyaperumal Punniyamoorthy. Cubic inverse soft set. In Soft Computing,
pages 87–94. CRC Press, 2023.
[83] G Muhiuddin and Abdullah M Al-roqi. Cubic soft sets with applications in bck/bci-algebras. Annals of
Fuzzy Mathematics and Informatics, 8(2):291–304, 2014.
[84] Fatia Fatimah, Dedi Rosadi, RB Fajriya Hakim, and José Carlos R. Alcantud. Probabilistic soft sets and
dual probabilistic soft sets in decision-making. Neural Computing and Applications, 31:397–407, 2019.
[85] Ping Zhu and Qiaoyan Wen. Probabilistic soft sets. In 2010 IEEE international conference on granular
computing, pages 635–638. IEEE, 2010.
[86] Bindu Nila and Jagannath Roy. Analysis of critical success factors of logistics 4.0 using d-number based
pythagorean fuzzy dematel method. Decision Making Advances, 2(1):92–104, 2024.
[87] Yuzhen Li and Yabin Shao. Fuzzy cognitive maps based on d-number theory. IEEE Access, 10:72702–72716,
2022.
[88] Nuttapong Wattanasiripong, Nuchanat Tiprachot, and Somsak Lekkoksung. On tripolar complex fuzzy
sets and their application in ordered semigroups. International Journal of Analysis and Applications,
23:139–139, 2025.
[89] Songsong Dai. Linguistic complex fuzzy sets. Axioms, 12(4):328, 2023.
[90] Faisal Al-Sharqi, Ashraf Al-Quran, et al. Similarity measures on interval-complex neutrosophic soft sets
with applications to decision making and medical diagnosis under uncertainty. Neutrosophic Sets and
Systems, 51:495–515, 2022.
[91] Said Broumi, Mohamed Talea, Assia Bakali, and Florentin Smarandache. Complex neutrosophic graphs
of type. Collected Papers. Volume VI: On Neutrosophic Theory and Applications, page 204, 2022.
[92] Naveed Yaqoob and Muhammad Akram. Complex neutrosophic graphs. Infinite Study, 2018.
[93] Tahir Mahmood and Ubaid ur Rehman. A novel approach towards bipolar complex fuzzy sets and their
applications in generalized similarity measures. International Journal of Intelligent Systems, 37:535–567,
2021.
[94] Daniel Ramot, Menahem Friedman, Gideon Langholz, Ron Milo, and Abraham Kandel. On complex fuzzy
sets. 10th IEEE International Conference on Fuzzy Systems (Cat. No.01CH37297), 3:1160–1163, 2001.
[95] Güzide Şenel. A new construction of spheres via soft real numbers and soft points. Mathematics Letters,
4(3):39–43, 2018.
[96] Sujoy Das and SK Samanta. On soft complex sets and soft complex numbers. J. fuzzy math, 21(1):195–216,
2013.
[97] Sujoy Das and SK Samanta. Soft real sets, soft real numbers and their properties. J. fuzzy Math,
20(3):551–576, 2012.
[98] Seok Zun Song, Hee Sik Kim, and Young Bae Jun. Ideal theory in semigroups based on intersectional soft
sets. The Scientific World Journal, 2014(1):136424, 2014.
[99] Eun Hwan Roh and Young Bae Jun. Positive implicative ideals of bck-algebras based on intersectional
soft sets. Journal of Applied Mathematics, 2013(1):853907, 2013.
[100] G Muhiuddin. Intersectional soft sets theory applied to generalized hypervector spaces. Analele ştiinţifice
ale Universităţii “Ovidius” Constanţa, Seria Matematică, 28(3):171–191, 2020.
[101] Young Bae Jun, Chul Hwan Park, and Noura Omair Alshehri. Hypervector spaces based on intersectional
soft sets. In Abstract and Applied Analysis. Wiley Online Library, 2014.
[102] Hüseyin Kamacı and Subramanian Petchimuthu. Bipolar n-soft set theory with applications. Soft Computing, 24:16727–16743, 2020.
[103] Muhammad Akram, Arooj Adeel, and José Carlos Rodriguez Alcantud. Group decision-making methods
based on hesitant n-soft sets. Expert Syst. Appl., 115:95–105, 2019.
[104] Fatia Fatimah and José Carlos Rodriguez Alcantud. The multi-fuzzy n-soft set and its applications to
decision-making. Neural Computing and Applications, 33:11437–11446, 2021.
[105] Fatia Fatimah, Dedi Rosadi, R. B. Fajriya Hakim, and José Carlos Rodriguez Alcantud. N-soft sets and
their decision making algorithms. Soft Computing, 22:3829–3842, 2017.
[106] Sagvan Y Musa, Ramadhan A Mohammed, and Baravan A Asaad. N-hypersoft sets: An innovative
extension of hypersoft sets and their applications. Symmetry, 15(9):1795, 2023.
[107] Sagvan Y Musa. N-bipolar hypersoft sets: Enhancing decision-making algorithms. Plos one,
19(1):e0296396, 2024.
[108] Orhan Dalkılıç. Unifying relationships in uncertain environments: examining relations in binary soft sets
for expressing inter-object correspondence. The Journal of Supercomputing, 81(16):1–29, 2025.
[109] Ahu Açıkgöz and Nihal Tas. Binary soft set theory. European Journal of Pure and Applied Mathematics,
9(4):452–463, 2016.
[110] Muhammad Saqlain, Poom Kumam, and Wiyada Kumam. Multi-criteria decision-making method based
on weighted and geometric aggregate operators of linguistic fuzzy-valued hypersoft set with application.
Journal of Fuzzy Extension and Applications, 6(2):344–370, 2025.
[111] Muhammad Saqlain, Poom Kumam, and Wiyada Kumam. Linguistic hypersoft set with application to
multi-criteria decision-making to enhance rural health services. Neutrosophic Sets and Systems, 61:28–52,
2023.
[112] Srinivasan Vijayabalaji and Adhimoolam Ramesh. Uncertain multiplicative linguistic soft sets and their
application to group decision making. Journal of Intelligent & Fuzzy Systems, 35(3):3883–3893, 2018.
[113] Hongjun Guan, Shuang Guan, and Aiwu Zhao. Intuitionistic fuzzy linguistic soft sets and their application
in multi-attribute decision-making. Journal of Intelligent & Fuzzy Systems, 31(6):2869–2879, 2016.
[114] Zhao Aiwu and Guan Hongjun. Fuzzy-valued linguistic soft set theory and multi-attribute decision-making
application. Chaos, Solitons & Fractals, 89:2–7, 2016.
[115] Zhifu Tao, Huayou Chen, Ligang Zhou, and Jinpei Liu. 2-tuple linguistic soft set and its application to
group decision making. Soft computing, 19:1201–1213, 2015.
[116] NLA Mohd Kamal, Lazim Abdullah, and Ilyani Abdullah. Multi-valued neutrosophic linguistic soft set
and its application in multi-criteria decision-making. Journal of Advanced Research in Dynamical and
Control Systems, 11:12, 2019.
[117] Takaaki Fujita. Metafuzzy, metaneutrosophic, metasoft, and metarough set. 2025.
[118] Takaaki Fujita. Metastructure, meta-hyperstructure, and meta-superhyper structure. Journal of Computers and Applications, 1(1):1–22, 2025.
[119] Asghar Khan, Muhammad Izhar, and Mohammed M Khalaf. Double-framed soft la-semigroups. Journal
of Intelligent & Fuzzy Systems, 33(6):3339–3353, 2017.
[120] Yanbin Liu, Peina Liang, and Jingjie Ma. An empirical study on the quality of industry-linked education
in vocational colleges: Double-framed treesoft set framework. Neutrosophic Sets and Systems, 85(1):32,
2025.
[121] Muhammad Saeed, Hafiz Inam ul Haq, and Mubashir Ali. Extension of double frame soft set to double
frame hypersoft set (dfss to dfhss). HyperSoft Set Methods in Engineering, 2:18–27, 2024.
[122] Muhammad Izhar, Tariq Mahmood, Asghar Khan, Muhammad Farooq, and Kostaq Hila. Double-framed
soft set theory applied to abel-grassmann’s hypergroupoids. New Mathematics and Natural Computation,
18(03):819–841, 2022.
[123] Muhammad Saeed, Muhammad Rayees Ahmad, Muhammad Saqlain, and Muhammad Riaz. Rudiments
of n-framed soft sets. Punjab University Journal of Mathematics, 52(5), 2020.
[124] Muhammad Rayees Ahmad, Usman Afzal, Nadir Omer, Ali Delham Algarni, Sara A Ghorashi, and Huda
Eltayeb. A computational diagnostic model for infectious diseases via similarity measures on n-framed
plithogenic hypersoft sets. Alexandria Engineering Journal, 127:1209–1219, 2025.
[125] Usman Afzal, Muhammad Rayees Ahmad, Nazek Alessa, Nauman Raza, Fathea MO Birkea, Salem Alkhalaf, and Nader Omer. Intelligent faculty evaluation and ranking system based on n-framed plithogenic fuzzy
hypersoft set and extended nr-topsis. Alexandria Engineering Journal, 109:18–28, 2024.
[126] Ajoy Kanti Das, Florentin Smarandache, Rakhal Das, and Suman Das. A comprehensive study on decision-making algorithms in retail and project management using double framed hypersoft sets. HyperSoft Set
Methods in Engineering, 2:62–71, 2024.
[127] Minyan Chen. Double framed hypersoft set for studies factors that influence and ways to improve vocational
college instruction in innovation and entrepreneurship. Neutrosophic Sets and Systems, 85(1):5, 2025.
[128] Lingling Chen. Measuring teaching success in college foreign literature programs: An evaluation perspective
using double framed superhypersoft set. Neutrosophic Sets and Systems, 85(1):7, 2025.
[129] Takaaki Fujita. Double-framed superhypersoft set and double-framed treesoft set. Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft,
Rough, and Beyond, page 71, 2025.
[130] Ke Gong, Panpan Wang, and Zhi Xiao. Bijective soft set decision system based parameters reduction
under fuzzy environments. Applied Mathematical Modelling, 37(6):4474–4485, 2013.
[131] Ke Gong, Zhi Xiao, and Xia Zhang. The bijective soft set with its operations. Comput. Math. Appl.,
60:2270–2278, 2010.
[132] Varun Kumar Tiwari, Prashant Kumar Jain, and Puneet Tandon. An integrated shannon entropy and topsis for product design concept evaluation based on bijective soft set. Journal of Intelligent Manufacturing,
30:1645–1658, 2017.
[133] Atiqe Ur Rahman, Muhammad Saeed, and Abida Hafeez. Theory of bijective hypersoft set with application
in decision making. Punjab University Journal of Mathematics, 53(7), 2021.
[134] Takaaki Fujita. N-superhypersoft set and bijective superhypersoft set. Advancing Uncertain Combinatorics
through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft, Rough, and Beyond,
page 138, 2025.
[135] Muhammad Ihsan, Muhammad Saeed, Atiqe Ur Rahman, and Florentin Smarandache. Multi-attribute
decision support model based on bijective hypersoft expert set. Punjab University Journal of Mathematics,
54(1), 2022.
[136] Gustavo Santos-García and José Carlos R Alcantud. Ranked soft sets. Expert Systems, 40(6):e13231, 2023.
[137] Irfan Deli. Refined neutrosophic sets and refined neutrosophic soft sets: theory and applications. In
Handbook of research on generalized and hybrid set structures and applications for soft computing, pages
321–343. IGI Global, 2016.
[138] Takaaki Fujita and Florentin Smarandache. Some types of hyperneutrosophic set (6): Multineutrosophic
set and refined neutrosophic set. Infinite Study, 2025.
[139] Florentin Smarandache. n-valued refined neutrosophic logic and its applications to physics. Infinite study,
4:143–146, 2013.
[140] Anjan Mukherjee, Mithun Datta, and Abhijit Saha. Refined soft sets and its applications. Journal of New
Theory, 14:10–25, 2016.
[141] Faruk Karaaslan. Correlation coefficients of single-valued neutrosophic refined soft sets and their applications in clustering analysis. Neural Computing and Applications, 28(9):2781–2793, 2017.
[142] R Anitha Cruz. Neutrosophic soft cubic refined sets. Neutrosophic Sets & Systems, 73, 2024.
[143] Takaaki Fujita and Arif Mehmood. Iterative multifuzzy set, iterative multineutrosophic set, iterative
multisoft set, and multiplithogenic sets. Neutrosophic Computing and Machine Learning, 41:1–30, 2025.
[144] Shawkat Alkhazaleh, Abdul Razak Salleh, Nasruddin Hassan, and Abd Ghafur Ahmad. Multisoft sets. In
Proc. 2nd International Conference on Mathematical Sciences, pages 910–917, 2010.
[145] Florentin Smarandache. Practical applications of IndetermSoft Set and IndetermHyperSoft Set and introduction to TreeSoft Set as an extension of the MultiSoft Set. Infinite Study, 2022.
[146] Sabu Sebastian and TV Ramakrishnan. Multi-fuzzy sets: An extension of fuzzy sets. Fuzzy Information
and Engineering, 3:35–43, 2011.
[147] Mahalakshmi Pethaperumal, Vimala Jeyakumar, Jeevitha Kannan, and Ashma Banu. An algebraic analysis on exploring q-rung orthopair multi-fuzzy sets. Journal of fuzzy extension and applications, 4(3):235–245,
2023.
[148] Johanna Estefanía Imbaquingo, Karen Milagros Díaz Salambay, Jordy Alexis Vargas Yumbo, and Kevin
Christopher Carrasco Azogue. Intercultural education from ancestral logic: Application of the ayni method multi-neutrosophic to strengthen mixed methodology in the waorani community.
Neutrosophic Sets and Systems, 92:263–283, 2025.
[149] Ennio Jesús Mérida Córdova, Elizabeth Esther Vergel Parejo, and Raúl López Fernández. Scholarai scholarly article search strategies with the ayni method multi-neutrosophic for ethical information management
in ai. Neutrosophic Sets and Systems, 92:380–397, 2025.
[150] Takaaki Fujita and Florentin Smarandache. An introduction to advanced soft set variants: Superhypersoft
sets, indetermsuperhypersoft sets, indetermtreesoft sets, bihypersoft sets, graphicsoft sets, and beyond.
Neutrosophic Sets and Systems, 82:817–843, 2025.
[151] Ari M Lipsky and Sander Greenland. Causal directed acyclic graphs. JAMA, 327(11):1083–1084, 2022.
[152] Takaaki Fujita. Directed acyclic superhypergraphs (dash): A general framework for hierarchical dependency modeling. Neutrosophic Knowledge, 6:72–86, 2025.
[153] Peter WG Tennant, Eleanor J Murray, Kellyn F Arnold, Laurie Berrie, Matthew P Fox, Sarah C Gadd,
Wendy J Harrison, Claire Keeble, Lynsie R Ranker, Johannes Textor, et al. Use of directed acyclic graphs
(dags) to identify confounders in applied health research: review and recommendations. International
journal of epidemiology, 50(2):620–632, 2021.
[154] Weizhou Shen, Siyue Wu, Yunyi Yang, and Xiaojun Quan. Directed acyclic graph network for conversational emotion recognition. arXiv preprint arXiv:2105.12907, 2021.
[155] Takaaki Fujita and Florentin Smarandache. An introduction to advanced soft set variants: Superhypersoft
sets, indetermsuperhypersoft sets, indetermtreesoft sets, bihypersoft sets, graphicsoft sets, and beyond.
Neutrosophic Sets and Systems, 82:817–843, 2025.
[156] Mehmet Şahin, İrfan Deli, and Vakkas Uluçay. Bipolar Neutrosophic Soft Expert Sets. Infinite Study, 2016.
[157] Faisal Al-Sharqi, Abd Ghafur Ahmad, and Ashraf Al-Quran. Interval-valued neutrosophic soft expert set
from real space to complex space. CMES-Computer Modeling in Engineering & Sciences, 132(1), 2022.
[158] Ashraf Al-Quran and Nasruddin Hassan. The complex neutrosophic soft expert set and its application in
decision making. Journal of Intelligent & Fuzzy Systems, 34(1):569–582, 2018.
[159] Ashraf Al-Quran, Nasruddin Hassan, and Shawkat Alkhazaleh. Fuzzy parameterized complex neutrosophic
soft expert set for decision under uncertainty. Symmetry, 11(3):382, 2019.
[160] Fathima Perveen PA, Sunil Jacob John, et al. On spherical fuzzy soft expert sets. In AIP conference
proceedings. AIP Publishing, 2020.
[161] Yousef Al-Qudah and Nasruddin Hassan. Fuzzy parameterized complex multi-fuzzy soft expert sets. THE
2018 UKM FST POSTGRADUATE COLLOQUIUM: Proceedings of the Universiti Kebangsaan Malaysia,
Faculty of Science and Technology 2018 Postgraduate Colloquium, 2019.
[162] Mehmet Sahin, Shawkat Alkhazaleh, and Vakkas Ulucay. Neutrosophic soft expert sets. Applied
Mathematics-a Journal of Chinese Universities Series B, 06:116–127, 2015.
[163] Faisal Al-Sharqi, Yousef Al-Qudah, and Naif Alotaibi. Decision-making techniques based on similarity
measures of possibility neutrosophic soft expert sets. Neutrosophic Sets and Systems, vol. 55/2023: An
International Journal in Information Science and Engineering, page 358, 2024.
[164] Sumyyah Al-Hijjawi, Abd Ghafur Ahmad, and Shawkat Alkhazaleh. Effective neutrosophic soft expert set
and its application. International Journal of Neutrosophic Science (IJNS), 23(1), 2024.
[165] Takaaki Fujita. Superhypersoft rough set, superhypersoft expert set, and bipolar superhypersoft set.
Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy,
Neutrosophic, Soft, Rough, and Beyond, page 270, 2025.
[166] Ashraf Al-Quran, Nasruddin Hassan, and Emad A. Marei. A novel approach to neutrosophic soft rough
set under uncertainty. Symmetry, 11:384, 2019.
[167] Xinyi Wang and Qinghai Wang. Uncertainty measurement of variable precision fuzzy soft rough set model.
In CECNet, 2022.
[168] Tasawar Abbas, Rehan Zafar, Sana Anjum, Ambreen Ayub, and Zamir Hussain. An innovative soft rough
dual hesitant fuzzy sets and dual hesitant fuzzy soft rough sets. VFAST Transactions on Mathematics,
2023.
[169] Fu Zhang, Weimin Ma, and Hongwei Ma. Dynamic chaotic multi-attribute group decision making under
weighted t-spherical fuzzy soft rough sets. Symmetry, 15:307, 2023.
[170] Aysun Benek and Taha Yasin Ozturk. A comparative analysis of two different decision-making methods
in neutrosophic soft rough set environments. OPSEARCH, pages 1–22, 2025.
[171] Jingjing Zhang. Neutrosophic soft rough sets for quality evaluation of interactive music teaching in higher
education: A novel approach. Neutrosophic Sets and Systems, 90(1):66, 2025.
[172] Siyang Yang. Extending superhypersoft framework: Weighted soft sets for priority-based decision-making
in engineering ethics risk analysis based on big data technology. Neutrosophic Sets and Systems, 86:119–125,
2025.
[173] K Selvakumari. Solving game problem using weighted soft sets. Journal of Computer and Mathematical
Sciences, 9(10):1307–1311, 2018.
[174] Holy-Heavy M Balami, Aliyu G Dzarma, and Mohammed A Mohammed. Weighted soft set and its
application in parameterized decision making processes. International Journal of Development Mathematics
(IJDM), 2(1):131–144, 2025.
[175] Omer Akguller. Geometric soft sets. Hittite Journal of Science and Engineering, 4(2):159–164, 2017.
[176] Abdul Razak Salleh, Shawkat Alkhazaleh, Nasruddin Hassan, and Abd Ghafur Ahmad. Multiparameterized soft set. Journal of Mathematics and Statistics, 8(1):92–97, 2012.
[177] Takaaki Fujita and Iqbal M Batiha. Multiparameterized hypersoft set and type-2 hypersoft set. Neutrosophic Sets and Systems, 95:183–199, 2026.
[178] Young Bae Jun, Seok Zun Song, and G Muhiuddin. Concave soft sets, critical soft points, and union-soft
ideals of ordered semigroups. The Scientific World Journal, 2014(1):467968, 2014.
[179] Atiqe Ur Rahman, Muhammad Saeed, and Florentin Smarandache. Convex and concave hypersoft sets
with some properties, volume 38. Infinite Study, 2020.
[180] İrfan Deli. Convex and concave sets based on soft sets and fuzzy soft sets. Journal of New Theory,
29:101–110, 2019.
[181] P. A. Fathima Perveen and Sunil Jacob John. Relations on spherical fuzzy soft sets. 2nd International
Conference on Computational Sciences - Modelling, Computing and Soft Computing (CSMCS 2022), 2023.
[182] Sujit Das and Samarjit Kar. Intuitionistic multi fuzzy soft set and its application in decision making. In
Pattern Recognition and Machine Intelligence: 5th International Conference, PReMI 2013, Kolkata, India,
December 10-14, 2013. Proceedings 5, pages 587–592. Springer, 2013.
[183] Muhammad Saeed, Irfan Saif Ud Din, Imtiaz Tariq, and Harish Garg. Refined fuzzy soft sets: Properties,
set-theoretic operations and axiomatic results. Journal of Computational and Cognitive Engineering,
3(1):24–33, 2024.
[184] Sheikh Zain Majid, Muhammad Saeed, Umar Ishtiaq, and Ioannis K Argyros. The development of a hybrid
model for dam site selection using a fuzzy hypersoft set and a plithogenic multipolar fuzzy hypersoft set.
Foundations, 4(1):32–46, 2024.
[185] Xingsi Xue, Himanshu Dhumras, Garima Thakur, Rakesh Kumar Bajaj, and Varun Shukla. Schweizer-Sklar t-norm operators for picture fuzzy hypersoft sets: Advancing sustainable technology in social healthy
environments. Computers, Materials & Continua, 84(1), 2025.
[186] R Hema, R Sudharani, and M Kavitha. A novel approach on plithogenic interval valued neutrosophic
hypersoft sets and its application in decision making. Indian Journal Of Science And Technology, 2023.
[187] Takaaki Fujita. Hyperfuzzy hypersoft set and hyperneutrosophic hypersoft set. Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft, Rough,
and Beyond, page 247, 2025.
[188] Francina Shalini. Trigonometric similarity measures of pythagorean neutrosophic hypersoft sets. Neutrosophic Systems with Applications, 2023.
[189] Muhammad Saqlain and Xiao Long Xin. Interval valued, m-polar and m-polar interval valued neutrosophic
hypersoft sets. Infinite Study, 2020.
[190] Yuncheng Jiang, Yong Tang, Qimai Chen, Hai Liu, and Jianchao Tang. Interval-valued intuitionistic fuzzy
soft sets and their properties. Computers & Mathematics with Applications, 60(3):906–918, 2010.
[191] Harish Garg and Rishu Arora. Bonferroni mean aggregation operators under intuitionistic fuzzy soft
set environment and their applications to decision-making. Journal of the Operational Research Society,
69:1711–1724, 2018.
[192] Harish Garg and Rishu Arora. Topsis method based on correlation coefficient for solving decision-making
problems with intuitionistic fuzzy soft set information. In AIMS mathematics, 2020.
[193] Harish Garg and Rishu Arora. Generalized maclaurin symmetric mean aggregation operators based on
archimedean t-norm of the intuitionistic fuzzy soft set information. Artificial Intelligence Review,
54:3173–3213, 2020.
[194] Krassimir T Atanassov and G Gargov. Intuitionistic fuzzy logics. Springer, 2017.
[195] Shawkat Alkhazaleh. n-valued refined neutrosophic soft set theory. Journal of Intelligent & Fuzzy Systems,
32(6):4311–4318, 2017.
[196] Shawkat Alkhazaleh and Ayman A Hazaymeh. N-valued refined neutrosophic soft sets and their applications in decision making problems and medical diagnosis. Journal of Artificial Intelligence and Soft
Computing Research, 8(1):79–86, 2018.
[197] Muhammad Akram and Sundas Shahzadi. Representation of graphs using intuitionistic neutrosophic soft
sets. Infinite Study, 2016.
[198] S Broumi and Tomasz Witczak. Heptapartitioned neutrosophic soft set. International Journal of Neutrosophic Science, 18(4):270–290, 2022.
[199] Quang-Thinh Bui, My-Phuong Ngo, Vaclav Snasel, Witold Pedrycz, and Bay Vo. The sequence of neutrosophic soft sets and a decision-making problem in medical diagnosis. International Journal of Fuzzy
Systems, 24:2036–2053, 2022.
[200] Hüseyin Kamacı. Linguistic single-valued neutrosophic soft sets with applications in game theory. International Journal of Intelligent Systems, 36(8):3917–3960, 2021.
[201] S. Onar. A note on neutrosophic soft set over hyperalgebras. Symmetry, 16(10):1288, 2024.
[202] Faruk Karaaslan. Neutrosophic soft sets with applications in decision making. Infinite Study, 2014.
[203] Pabitra Kumar Maji. Neutrosophic soft set. Infinite Study, 2013.
[204] Fazeelat Sultana, Muhammad Gulistan, Mumtaz Ali, Naveed Yaqoob, Muhammad Khan, Tabasam Rashid,
and Tauseef Ahmed. A study of plithogenic graphs: applications in spreading coronavirus disease (covid-19)
globally. Journal of ambient intelligence and humanized computing, 14(10):13139–13159, 2023.
[205] Nivetha Martin. Introduction to possibility plithogenic soft sets. Plithogenic Logic and Computation, 2024.
[206] Shawkat Alkhazaleh. Plithogenic soft set. Infinite Study, 2020.
[207] Takaaki Fujita and Florentin Smarandache. A unified framework for u-structures and functorial structure:
Managing super, hyper, superhyper, tree, and forest uncertain over/under/off models. Neutrosophic Sets
and Systems, 91:337–380, 2025.
[208] Takaaki Fujita and Florentin Smarandache. HyperGraph and SuperHyperGraph Theory with Applications
(IV): Uncertain Graph Theory, volume IV of HyperGraph and SuperHyperGraph Theory with Applications.
Neutrosophic Science International Association (NSIA) Publishing House, 1.0 edition, 2026.
[209] Takaaki Fujita and Florentin Smarandache. HyperGraph and SuperHyperGraph Theory with Applications.
Neutrosophic Science International Association (NSIA) Publishing House, 2026.
[210] YS Yun. Parametric operations between 3-dimensional triangular fuzzy number and trapezoidal fuzzy set.
Journal of Algebra & Applied Mathematics, 21(2), 2023.
[211] Tong Shaocheng. Interval number and fuzzy number linear programmings. Fuzzy sets and systems,
66(3):301–306, 1994.
[212] Takaaki Fujita and Florentin Smarandache. A Dynamic Survey of Fuzzy, Intuitionistic Fuzzy, Neutrosophic,
Plithogenic, and Extensional Sets. Neutrosophic Science International Association (NSIA), 2025.
[213] Jyoti D Thenge, B Surendranath Reddy, and Rupali S Jain. Contribution to soft graph and soft tree. New
Mathematics and Natural Computation, 15(01):129–143, 2019.
[214] Muhammad Akram and Saira Nawaz. Operations on soft graphs. Fuzzy Information and Engineering,
7(4):423–449, 2015.
[215] Muhammad Saeed, Muhammad Khubab Siddique, Muhammad Ahsan, Muhammad Rayees Ahmad, and
Atiqe Ur Rahman. A novel approach to the rudiments of hypersoft graphs. Theory and Application of
Hypersoft Set, Pons Publication House, Brussel, pages 203–214, 2021.
[216] Muhammad Saeed, Atiqe Ur Rahman, and Muhammad Arshad. A study on some operations and products
of neutrosophic hypersoft graphs. Journal of Applied Mathematics and Computing, 68(4):2187–2214, 2022.
[217] Muhammad Saeed, Muhammad Imran Harl, Muhammad Haris Saeed, and Ibrahim Mekawy. Theoretical
framework for a decision support system for micro-enterprise supermarket investment risk assessment using
novel picture fuzzy hypersoft graph. Plos one, 18(3):e0273642, 2023.
[218] R. Jahir Hussain and M. S. Afya Farhana. Fuzzy chromatic number of fuzzy soft cycle and complete fuzzy
soft graphs. AIP Conference Proceedings, 2023.
[219] Umair Amin, Aliya Fahmi, Yaqoob Naveed, Aqsa Farid, and Muhammad Arshad Shehzad Hassan. Domination in bipolar fuzzy soft graphs. J. Intell. Fuzzy Syst., 46:6369–6382, 2024.
[220] Vakkas Ulucay. Q-neutrosophic soft graphs in operations management and communication network. Soft
Computing, 25:8441–8459, 2021.
[221] S Satham Hussain, R Hussain, and Florentin Smarandache. Domination number in neutrosophic soft
graphs. Neutrosophic Sets and Systems, 28:228–244, 2019.
[222] Muhammad Akram and Hafiza Saba Nawaz. Implementation of single-valued neutrosophic soft hypergraphs on human nervous system. Artificial Intelligence Review, 56(2):1387–1425, 2023.
[223] Bobin George, Jinta Jose, and Rajesh K Thumbakara. Exploring soft hypergraphs through various operations. New Mathematics and Natural Computation, 20(02):551–566, 2024.
[224] Takaaki Fujita, Atiqe Ur Rahman, Arkan A Ghaib, Talal Ali Al-Hawary, and Arif Mehmood Khattak. On
the properties and illustrative examples of soft superhypergraphs and rough superhypergraphs. Prospects
for Applied Mathematics and Data Analysis, 5(1):12–31, 2025.
[225] Jinta Jose, Bobin George, and Rajesh K Thumbakara. Advancements in soft directed graph theory: new
ideas and properties. New Mathematics and Natural Computation, pages 1–17, 2024.
[226] Jinta Jose, Bobin George, and Rajesh K Thumbakara. Soft directed graphs, their vertex degrees, associated
matrices and some product operations. New Mathematics and Natural Computation, 19(03):651–686, 2023.
[227] Raed Hatamleh, Nasir Odat, Hamza Ali Abujabal, Faria Khan, Arif Mehmood Khattak, Alaa M. Abd
El-latif, Husham M. Attaalfadeel, and Abdelhalim Hasnaoui. Fermatean double-valued neutrosophic soft
topological spaces. European Journal of Pure and Applied Mathematics, 2025.
[228] Maha Mohammed Saeed, Sami Ullah Khan, Fatima Suriyya, Arif Mehmood, and Jamil J Hamja. Interval-valued complex neutrosophic sets and complex neutrosophic soft topological spaces. International Journal
of Analysis and Applications, 23:132–132, 2025.
[229] V Subash and M Angayarkanni. Neutrosophic hypersoft topological spaces via m-open sets. JP Journal
of Geometry and Topology, 31(1):39–54, 2025.
[230] V Subash and M Angayarkanni. Contra m-continuous maps and contra m-irresolute maps in fuzzy hypersoft
topological spaces. International Journal of Environmental Sciences, 11(6s):431–448, 2025.
[231] Sagvan Younis Musa and Baravan Abdulmuhsen Asaad. Connectedness on bipolar hypersoft topological
spaces. Journal of Intelligent & Fuzzy Systems, 43(4):4095–4105, 2022.
[232] PG Patil, C Jaya Subba Reddy, Rani Teli, and Vyshakha Elluru. New structures in fuzzy binary soft
topological spaces. International Journal of Mathematics Trends and Technology-IJMTT, 71, 2025.
[233] Rui Gao and Jianrong Wu. Filter with its applications in fuzzy soft topological spaces. AIMS Mathematics,
6(3):2359–2368, 2021.
[234] A Mukherjee and AK Das. Parameterized topological space induced by an intuitionistic fuzzy soft multi
topological space. Ann. Pure and Applied Math, 7:7–12, 2014.
[235] Francisco Gallego Lupiáñez. On intuitionistic fuzzy topological spaces. Kybernetes, 35(5):743–747, 2006.
[236] M Parimala, M Karthika, and Florentin Smarandache. A review of fuzzy soft topological spaces, intuitionistic fuzzy soft topological spaces and neutrosophic soft topological spaces. Infinite Study, 2020.
[237] Maha Mohammed Saeed, Raed Hatamleh, Alaa M Abd El-latif, Abdallah Al-Husban, Takaaki
Fujita, Cris L Armada, Rabia Andleeb, and Arif Mehmood Khattak. Separation axioms in quadri-partition
neutrosophic soft topological spaces. European Journal of Pure and Applied Mathematics, 18(3):6324–6324,
2025.
[238] S Kumar, A Mary, and R Radha. Penta partitioned neutrosophic soft topological space. Fuzzy, Intuitionistic and Neutrosophic Set Theories and their Applications in Decision Analysis, pages 49–59, 2025.
[239] Noori F Al-Mayahi. Soft banach algebra: Theory and applications. Journal of Iraqi Al-Khawarizmi,
8(2):44–68, 2024.
[240] Young Bae Jun. Union-soft sets with applications in bck/bci-algebras. Bulletin of the Korean Mathematical
Society, 50(6):1937–1956, 2013.
[241] Zanyar A Ameen, Tareq M Al-shami, Radwan Abu-Gdairi, and Abdelwaheb Mhemdi. The relationship
between ordinary and soft algebras with an application. Mathematics, 11(9):2035, 2023.
[242] Nenad Stojanović. Soft sets whose soft measure is zero. Filomat, 39(17):5825–5832, 2025.
[243] Vakkas Ulucay, Mehmet Sahin, Necati Olgun, and Adem Kılıçman. On neutrosophic soft lattices. Afrika
Matematika, 28:379–388, 2017.
[244] S. Rajareega, J. Felicita Vimala, and D. Preethi. Complex intuitionistic fuzzy soft lattice ordered group
and its weighted distance measures. Mathematics, 2020.
[245] VD Jobish, KV Babitha, and Sunil Jacob John. On soft lattice operations. J Adv Res Pure Math,
5(2):71–86, 2013.
[246] Yingchao Shao and Keyun Qin. Fuzzy soft sets and fuzzy soft lattices. International Journal of Computational Intelligence Systems, 5(6):1135–1147, 2012.
[247] Vassilios Petridis and Vassilis G Kaburlasos. Learning in the framework of fuzzy lattices. IEEE Transactions
on Fuzzy Systems, 7(4):422–440, 1999.
[248] Shio Gai Quek, Ganeshsree Selvachandran, Vimala Jayakumar, Phet Duong, and Le Hoang Son. A new
decision making model based on complex intuitionistic fuzzy soft lattice for traffic monitoring in the
pandemic scenarios. Advanced Intelligent Systems, 6(11):2400145, 2024.
[249] S Rajareega, J Vimala, and D Preethi. Complex intuitionistic fuzzy soft lattice ordered group and its
weighted distance measures. Mathematics, 8(5):705, 2020.
[250] S Rajareega and J Vimala. Operations on complex intuitionistic fuzzy soft lattice ordered group and cifs-copras method for equipment selection process. Journal of Intelligent & Fuzzy Systems, 41(5):5709–5718,
2021.
[251] A Sezgin Sezer and AO Atagün. A new kind of vector space: soft vector space. Southeast Asian Bulletin
of Mathematics, 40(5):753–770, 2016.
[252] C Gunduz Aras, Ayse Sonmez, and Huseyin Cakalli. An approach to soft functions. J. Math. Anal,
8(2):129–138, 2017.
[253] Sabir Hussain. On some soft functions. Mathematical Sciences Letters, 4(1):55, 2015.
[254] Zanyar A Ameen and Mesfer H Alqahtani. Some classes of soft functions defined by soft open sets modulo
soft sets of the first category. Mathematics, 11(20):4368, 2023.
[255] Hacı Aktaş and Naim Çağman. Soft sets and soft groups. Information sciences, 177(13):2726–2735, 2007.
[256] Ajoy Kanti Das and Carlos Granados. An advanced approach to fuzzy soft group decision-making using
weighted average ratings. SN Computer Science, 2(6):471, 2021.
[257] Muhammad Saeed, Atiqe Ur Rahman, Muhammad Ahsan, and Florentin Smarandache. An inclusive study
on fundamentals of hypersoft set. Theory and Application of Hypersoft Set, 1:1–23, 2021.
[258] Abdülkadir Aygünoğlu and Halis Aygün. Introduction to fuzzy soft groups. Computers & Mathematics with
Applications, 58(6):1279–1286, 2009.
[259] Majdoleen Abu Qamar and Nasruddin Hassan. Characterizations of group theory under Q-neutrosophic
soft environment. Infinite Study, 2019.


# Page. 140

![Page Image](https://bcdn.docswell.com/page/L71Y48DXJG.jpg)

Bibliography
[260] Yıldıray Celik, Canan Ekiz, and Sultan Yamak. Applications of fuzzy soft sets in ring theory. Annals of
Fuzzy Mathematics and Informatics, 5(3):451–462, 2013.
[261] Jayanta Ghosh, Dhananjoy Mandal, and T Samanta. Soft structures of groups and rings. International
Journal of Scientific World, 5(2):117–125, 2017.
[262] Ummahan Acar, Fatih Koyuncu, and Bekir Tanay. Soft sets and soft rings. Computers & Mathematics
with Applications, 59(11):3458–3463, 2010.
[263] Roy Goetschel and William Voxman. Fuzzy matroids. Fuzzy Sets and Systems, 27:291–302, 1988.
[264] Roy Goetschel and William Voxman. Bases of fuzzy matroids. Fuzzy Sets and Systems, 31:253–261, 1989.
[265] Ladislav A. Novak. On Goetschel and Voxman fuzzy matroids. Fuzzy Sets and Systems, 117:407–412, 2001.
[266] Kholod M Hassan and Saied A Johnny. Matroidal structure based on soft-sets. In Journal of Physics:
Conference Series. IOP Publishing, 2020.
[267] Muhammad Akram, Musavarah Sarwar, and Wieslaw A Dudek. Bipolar fuzzy circuits. In Graphs for the
Analysis of Bipolar Fuzzy Information, pages 281–307. Springer, 2020.
[268] Ahmed B AL-Nafee, Said Broumi, and Florentin Smarandache. Neutrosophic soft bitopological spaces.
Infinite Study, 2021.
[269] A Kandil, OAE Tantawy, SA El-Sheikh, and Shawqi A Hazza. Pairwise open (closed) soft sets in soft
bitopological spaces. Ann. Fuzzy Math. Inform, 11(4):571–588, 2016.
[270] Basavaraj M Ittanagi. Soft bitopological spaces. International Journal of Computer Applications,
107(7):1–4, 2014.
[271] AF Sayed. Some separation axioms in fuzzy soft bitopological spaces. J. Math. Comput. Sci., 8(1):28–45,
2017.
[272] Taha Yasin Ozturk and Sadi Bayramov. Category of chain complexes of soft modules. International
Mathematical Forum, 7(20):981–992, 2012.
[273] Mohammed Amare Mohammed, Berehanu Bekele Belayneh, Zelalem Teshome Wale, Gezahagne Mulat
Addis, and Mohammed Tesemma. Construction of soft modules over soft abelian groups. Research in
Mathematics, 13(1):2605729, 2026.
[274] Mikail Bal and Necati Olgun. Soft neutrosophic modules. Mathematics, 6(12):323, 2018.
[275] Qiu-Mei Sun, Zi-Long Zhang, and Jing Liu. Soft sets and soft modules. In International Conference on
Rough Sets and Knowledge Technology, pages 403–409. Springer, 2008.
[276] Sadi Bayramov, Cigdem Gunduz, and M Ibrahim Yazar. Inverse system of fuzzy soft modules. Annals of
Fuzzy Mathematics and Informatics, 4(2):349–363, 2012.
[277] OA Tantawy and RM Hassan. Soft metric spaces. In 5th International Conference on Mathematics and
Information Sciences, 2016.
[278] İsmet Altıntaş and Peyil Esengul kyzy. Topology of soft partial metric spaces. Soft Computing,
29(19):5613–5623, 2025.
[279] Vildan Çetkin, Elif Güner, and Halis Aygün. On 2s-metric spaces. Soft Computing, 24(17):12731–12742,
2020.
[280] Sonam, Ramakant Bhardwaj, Josika Mal, Pulak Konar, and Phumin Sumalai. Fixed point results in soft
probabilistic metric spaces. The journal of Analysis, 33(1):139–166, 2025.
[281] Yuan Zou. Bayesian decision making under soft probabilities. Journal of Intelligent & Fuzzy Systems,
44(6):10661–10673, 2023.
[282] DA Molodtsov. Soft probability of large deviations. Advances in Systems Science and Applications,
13(1):53–67, 2013.
[283] Jing Qiu, Zhi Xiao, Wei Xu, and Ying Zhou. Soft probability based random forest for financial distress
prediction. Information Sciences, page 122870, 2025.
[284] Trevor Jack. On the complexity of properties of partial bijection semigroups, 2021.
[285] Rukchart Prasertpong and Aiyared Iampan. Approximation approaches for rough hypersoft sets based on
hesitant bipolar-valued fuzzy hypersoft relations on semigroups. Journal of Mathematics and Computer
Science, 2022.
[286] Young Bae Jun, Kyoung Ja Lee, and Asghar Khan. Soft ordered semigroups. Mathematical Logic Quarterly,
56(1):42–50, 2010.
[287] Tahir Mahmood, Muhammad Asif, Ubaid ur Rehman, and Jabbar Ahmmad. T-bipolar soft semigroups
and related results. Spectrum of Mechanical Engineering and Operational Research, 1(1):258–271, 2024.
[288] Munazza Naz, Muhammad Shabir, and Muhammad Irfan Ali. On fuzzy soft semigroups. World Applied
Sciences Journal (Special Issue of Applied Math), 22:62–83, 2013.
[289] Cheng-Fu Yang. Fuzzy soft semigroups and fuzzy soft ideals. Computers & Mathematics with Applications,
61(2):255–261, 2011.
[290] M Al Tahan and Bijan Davvaz. Weak chemical hyperstructures associated to electrochemical cells. Iranian
Journal of Mathematical Chemistry, 9(1):65–75, 2018.


# Page. 141

![Page Image](https://bcdn.docswell.com/page/G7WGXZYKE2.jpg)

[291] Ruggero Maria Santilli and Thomas Vougiouklis. Hyperstructures in Lie-Santilli admissibility and iso-theories. Ratio Mathematica, 33:151, 2017.
[292] Florentin Smarandache. Foundation of superhyperstructure & neutrosophic superhyperstructure. Neutrosophic Sets and Systems, 63(1):21, 2024.
[293] Sultan Yamak, Osman Kazancı, and Bijan Davvaz. Soft hyperstructure. Computers & Mathematics with
Applications, 62(2):797–803, 2011.
[294] Gulay Oguz and Bijan Davvaz. Soft topological hyperstructure. Journal of Intelligent & Fuzzy Systems, 40:8755–8764, 2021.
[295] GR Amiri, R Mousarezaei, and S Rahnama. Soft hyperstructures and their applications. New Mathematics
and Natural Computation, pages 1–19, 2024.
[296] Takaaki Fujita and Florentin Smarandache. Superhypergraph neural networks and plithogenic graph neural
networks: Theoretical foundations. Infinite Study, 2025.
[297] A Meenakshi, J Shivangi Mishra, Jeong Gon Lee, Antonios Kalampakas, and Sovan Samanta. Advanced
risk prediction in healthcare: Neutrosophic graph neural networks for disease transmission. Complex &
Intelligent Systems, 11(9):413, 2025.
[298] Filip Ekström Kelvinius, Dimitar Georgiev, Artur Toshev, and Johannes Gasteiger. Accelerating molecular
graph neural networks via knowledge distillation. Advances in Neural Information Processing Systems,
36:25761–25792, 2023.
[299] Daniel Vik, David Pii, Chirag Mudaliar, Mads Nørregaard-Madsen, and Aleksejs Kontijevskis. Performance and robustness of small molecule retention time prediction with molecular graph neural networks
in industrial drug discovery campaigns. Scientific Reports, 14(1):8733, 2024.
[300] Yingfang Yuan, Wenjun Wang, Xin Li, Kefan Chen, Yonghan Zhang, and Wei Pang. Evolving molecular
graph neural networks with hierarchical evaluation strategy. In Proceedings of the Genetic and Evolutionary
Computation Conference, pages 1417–1425, 2024.
[301] Midhilesh Momidi, Priyanka S Chauhan, Adityaram Komaraneni, Surya Prakash Ghattamaneni, Kamal
Upreti, and Nishant Kumar. Uncertainty-aware molecular property prediction using heterogeneous molecular graph neural networks. In International Conference on Generative Artificial Intelligence, Cryptography,
and Predictive Analytics, pages 243–254. Springer, 2024.
[302] AA Salama, Huda E Khalid, Ahmed K Essa, and Nadheer M Ahmed. A natural language processing
environment for rule-based decision making with neutrosophic logic to manage uncertainty and ambiguity.
Neutrosophic Sets and Systems, 82(1):44, 2025.
[303] Diego Fernando Coka Flores, Ignacio Fernando Barcos Arias, María Elena Infante Miranda, and Omar Mar
Cornelio. Applying neutrosophic natural language processing to analyze complex phenomena in interdisciplinary contexts. Neutrosophic Sets and Systems, 74:297–305, 2024.
[304] Sultan AlGhozali and Siti Mukminatun. Natural language processing of gemini artificial intelligence powered chatbot. Balangkas: An International Multidisciplinary Research Journal, 1(1):41–48, 2024.
[305] Naeemeh Adel. Fuzzy natural language similarity measures through computing with words. PhD thesis,
Manchester Metropolitan University, 2022.
[306] Yenson Vinicio Mogro Cepeda, Marco Antonio Riofrío Guevara, Emerson Javier Jácome Mogro, and
Rachele Piovanelli Tizano. Impact of irrigation water technification on seven directories of the san juan-patoa river using plithogenic n-superhypergraphs based on environmental indicators in the canton of pujilí,
2021. Neutrosophic Sets and Systems, 74(1):6, 2024.
[307] Mohammad Hamidi, Florentin Smarandache, and Elham Davneshvar. Spectrum of superhypergraphs via
flows. Journal of Mathematics, 2022(1):9158912, 2022.
[308] Takaaki Fujita and Florentin Smarandache. Neutrosophic soft n-super-hypergraphs with real-world applications. Infinite Study, 2025.
[309] Takaaki Fujita and Florentin Smarandache. Soft directed n-superhypergraphs with some real-world applications. European Journal of Pure and Applied Mathematics, 18(4):6643–6643, 2025.
[310] Ajoy Kanti Das, Rajat Das, Suman Das, Bijoy Krishna Debnath, Carlos Granados, Bimal Shil, and Rakhal
Das. A comprehensive study of neutrosophic superhyper bci-semigroups and their algebraic significance.
Transactions on Fuzzy Sets and Systems, 8(2):80, 2025.
[311] Adel Al-Odhari. A brief comparative study on hyperstructure, super hyperstructure, and n-super superhyperstructure. Neutrosophic Knowledge, 6:38–49, 2025.
[312] Mohammad Hamidi and Mohadeseh Taghinezhad. Application of Superhypergraphs-Based Domination
Number in Real World. Infinite Study, 2023.
[313] Mohammad Hamidi, Florentin Smarandache, and Mohadeseh Taghinezhad. Decision Making Based on
Valued Fuzzy Superhypergraphs. Infinite Study, 2023.
[314] Takaaki Fujita, Atiqe Ur Rahman, Arkan A Ghaib, Talal Ali Al-Hawary, and Arif Mehmood Khattak. On
the properties and illustrative examples of soft superhypergraphs and rough superhypergraphs. Prospects
for Applied Mathematics and Data Analysis, 5(1):12–31, 2025.


# Page. 142

![Page Image](https://bcdn.docswell.com/page/4JZL61XNE3.jpg)

[315] Takaaki Fujita. Review of plithogenic directed, mixed, bidirected, and pangene offgraph. Advancing
Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic,
Soft, Rough, and Beyond, page 120, 2024.
[316] Takaaki Fujita. Recursive hypergraphs and recursive superhypergraphs: Exploring more hierarchical and
generalized graph concepts.
[317] Miguel Ortiz-Barrios, Natalia Jaramillo-Rueda, Andrea Espeleta-Aris, Berk Kucukaltan, and Llanos
Cuenca. Integrated fuzzy decision-making methodology with intuitionistic fuzzy numbers: An application for disaster preparedness in clinical laboratories. Expert Systems with Applications, 263:125712, 2025.
[318] Hongxing Li and Vincent C Yen. Fuzzy sets and fuzzy decision-making. CRC press, 1995.
[319] Dragan Pamucar, Morteza Yazdani, Radojko Obradovic, Anil Kumar, and Mercedes Torres-Jiménez. A
novel fuzzy hybrid neutrosophic decision-making approach for the resilient supplier selection problem.
International Journal of Intelligent Systems, 35(12):1934–1986, 2020.
[320] Arunodaya Raj Mishra, Dragan Pamucar, Pratibha Rani, Rajeev Shrivastava, and Ibrahim M. Hezam.
Assessing the sustainable energy storage technologies using single-valued neutrosophic decision-making
framework with divergence measure. Expert Systems with Applications, 238:121791, 2023.
[321] G Muhiuddin, Mohamed E Elnair, Satham Hussain, and Durga Nagarajan. Topsis method-based decision-making model for bipolar quadripartitioned neutrosophic environment. Neutrosophic Sets and Systems,
85:899–918, 2025.
[322] Muhammet Gul, Suleyman Mete, Faruk Serin, and Erkan Celik. Fine–kinney-based occupational risk assessment using single-valued neutrosophic topsis. In Fine–Kinney-Based Fuzzy Multi-criteria Occupational
Risk Assessment: Approaches, Case Studies and Python Applications, pages 111–133. Springer, 2020.
[323] Ting-Yu Chen and Chueh-Yung Tsao. The interval-valued fuzzy topsis method and experimental analysis.
Fuzzy sets and systems, 159(11):1410–1428, 2008.
[324] Manoj Mathew, Ripon Kumar Chakrabortty, and Michael J. Ryan. A novel approach integrating ahp and
topsis under spherical fuzzy sets for advanced manufacturing system selection. Engineering Applications of Artificial Intelligence,
96:103988, 2020.
[325] Ali Azadeh, Morteza Saberi, Nasim Zandi Atashbar, Elizabeth Chang, and Peiman Pazhoheshfar. Z-ahp:
A z-number extension of fuzzy analytical hierarchy process. In 2013 7th IEEE International Conference
on Digital Ecosystems and Technologies (DEST), pages 141–147. IEEE, 2013.
[326] Hamid Reza Pourghasemi, Biswajeet Pradhan, and Candan Gokceoglu. Application of fuzzy logic and
analytical hierarchy process (ahp) to landslide susceptibility mapping at haraz watershed, iran. Natural
Hazards, 63:965–996, 2012.
[327] Mavera Nawaz, Arooj Adeel, and Muhammad Akram. Risk evaluation in failure mode and effect analysis:
Ahp-vikor method with picture fuzzy rough number. Granular Computing, 9(3):69, 2024.
[328] Xingang Wang, Yushui Geng, Peipei Yao, and Mengjie Yang. Multiple attribute group decision making
approach based on extended vikor and linguistic neutrosophic set. Journal of Intelligent & Fuzzy Systems,
36(1):149–160, 2019.
[329] Serafim Opricovic and Gwo-Hshiung Tzeng. Compromise solution by mcdm methods: A comparative
analysis of vikor and topsis. European Journal of Operational Research, 156:445–455, 2004.
[330] Luis A. Crespo Crespo-Berti, Haro Teran Lilian Fabiola, and Dinara Turaeva. Neutrosophic
decision making using saaty’s ahp method and vikor. Journal of Intelligent Systems and Internet of Things,
2024.
[331] Muhammad Riaz and Syeda Tayyba Tehrim. A robust extension of vikor method for bipolar fuzzy sets
using connection numbers of spa theory based metric spaces. Artificial Intelligence Review, 54:561–591,
2020.


# Page. 143

![Page Image](https://bcdn.docswell.com/page/YE6W2L49EV.jpg)

Soft set theory serves as a structured framework for parameterized decision modeling: it associates specific attributes with subsets of a given universe, allowing uncertainty to be represented effectively. Over the past several decades, the theory has expanded into a range of specialized models designed to handle increasingly complex data structures. This book presents a survey-style exploration of these developments, detailing core definitions and representative constructions for extensions such as:

- HyperSoft and SuperHyperSoft Sets: models that capture multi-attribute interactions and set-valued constraints.
- TreeSoft and ForestSoft Sets: hierarchical organizations that model refined parameters across multiple levels.
- Dynamic Soft Sets: time-indexed families of soft sets that model approximations as they evolve over time or context.
- Uncertainty-Aware Models: integrations with fuzzy sets, intuitionistic fuzzy sets, and neutrosophic sets.

In addition to theoretical foundations, the book highlights key applications in diverse fields, including decision-making (such as AHP and TOPSIS), topology, matroid theory, and graph neural networks. It aims to organize the vast body of existing research into a clear, accessible landscape for researchers and practitioners.
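The parameterized mapping described above — a soft set as a function from parameters to subsets of a universe, and a hypersoft set as the same idea over tuples of attribute values — can be sketched in a few lines of Python. The universe, parameter names, and the `restricted_intersection` helper below are hypothetical illustrations, not constructions taken from the book.

```python
# Minimal sketch: a soft set (F, A) over a universe U maps each
# parameter in A to the subset of U that satisfies it.

U = {"h1", "h2", "h3", "h4"}  # hypothetical universe (e.g., houses)

# Soft set: one attribute value per key.
soft_set = {
    "cheap":  {"h1", "h3"},
    "modern": {"h2", "h3", "h4"},
}

# HyperSoft set: the key is a tuple drawn from several disjoint
# attribute sets (here: price x style), mapped to a subset of U.
hypersoft_set = {
    ("cheap", "modern"):  {"h3"},
    ("cheap", "classic"): {"h1"},
}

def restricted_intersection(f, params):
    """Objects satisfying every listed parameter (a soft-set AND)."""
    out = set(U)
    for p in params:
        out &= f[p]
    return out

print(restricted_intersection(soft_set, ["cheap", "modern"]))  # {'h3'}
```

Note that the hypersoft entry `("cheap", "modern")` agrees with the restricted intersection of the two single-attribute approximations — the multi-attribute models refine, rather than replace, the basic soft-set picture.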


